American Board of Internal Medicine
Summer 2021 Psychometric Internship Announcement
The ABIM is pleased to announce the return of its summer internship in psychometrics.
The ABIM’s psychometric internship program is an eight-week summer internship running from Monday, June 14th to Friday, August 6th in Philadelphia, PA*. During the program, the intern will take primary ownership of an applied psychometric research project under the guidance of one of the ABIM’s measurement scientists. The intern will also have opportunities to assist psychometric staff on other research projects and to learn about operational processes (e.g., item analysis, IRT calibration, equating) within the context of medical certification.
* If necessary for health and safety reasons, the ABIM summer internship will be conducted remotely. ABIM will communicate with candidates about such decisions in the spring.
Qualifications:
- Doctoral student in educational measurement or a related field with at least two years of coursework completed by the start of the internship
- Preference will be given to applicants who have experience with item response theory
- Excellent communication skills
- Interest in certification testing
- Eligible to be legally employed in the United States
The ABIM provides a total of $10,000 for the eight-week internship program. This total includes an $8,000 stipend as well as a $2,000 housing allowance†.
† If the internship is conducted remotely, interns will not receive the housing allowance.
For their primary research project, the intern should expect to perform all stages of the research process, from literature review to discussion and dissemination of results. At the conclusion of the program, the intern will be expected to share their results by giving a brief presentation to an audience of psychometric staff. Further, the intern will be encouraged to submit their summer project(s) for presentation at a professional conference and/or for publication. The intern will work with their mentor to select an appropriate project for their experience level and interests. Examples of previously completed internship projects can be found on the next page.
Please submit your curriculum vitae and a letter of interest to Michele Johnson, Research Program Manager (email@example.com) by Monday, February 1, 2021.
Examples of Previously Completed Internship Projects
Please note these are examples of previous ABIM Interns’ projects. The summer 2021 internship project will be determined jointly by the 2021 mentor and summer intern.
- Anchor Item Replacement in the Presence of Consequential Item Parameter Drift. The purpose of this project was to develop evidence-based recommendations for replacing anchor items when a significant number are flagged for exhibiting item parameter drift. The intern conducted a simulation study that investigated different item replacement strategies and examined the effects of each strategy with respect to outcomes such as pass/fail classification accuracy and the RMSD of theta estimates.
Chang, K., & Rewley, K. Currently under review.
- Evaluating Use of an Online Open-Book Resource in a High Stakes Credentialing Exam. This project examined item and examinee characteristics associated with the use of an open-book resource throughout a high-stakes medical certification exam. Using exam process data and a generalized estimation equations modeling framework, the intern examined use of the open-book resource and how it might affect examinees’ test-taking experience and performance.
Myers, A., & Bashkov, B. Paper presented at NCME 2020.
- Investigating the Impact of Parameter Instability on IRT Proficiency Estimation. This project examined how poorly estimated item parameters impact different proficiency estimators. The intern conducted a simulation study to examine how different levels of parameter instability impact Bayesian vs. non-Bayesian estimators as well as pattern vs. summed-score estimators.
McGrath, K., & Smiley, W. Paper presented at NCME 2019.
- Using Data Visualization to Explore Test Speededness in Certification Exams. This project examined different ways to determine if a test is speeded. The intern conducted a thorough literature review of methods used to detect and quantify test speededness. She then used data visualization, a nonparametric approach that makes no distributional assumptions, to examine operational test data for speededness. This was shown to be a viable approach for assessing the impact of examination timing.
Sullivan, M., & Bashkov, B. Paper presented at TIME 2017.
- Automatic Flagging of Items for Key Validation. This project developed an automatic method for determining if there is a problem with an item's key. The intern collected data from psychometricians regarding which items required key validation and used logistic regression to mimic that professional judgment to automatically flag problematic items (see the illustrative sketch below).
Sahin, F., & Clauser, J. Paper presented at NCME 2016.
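As a purely illustrative sketch of the key-validation project above, the following R code fits a logistic regression that mimics flagging decisions from classical item statistics. The data are simulated and every variable name is hypothetical; this is not the ABIM's actual method.

# Illustrative R sketch: logistic regression to mimic key-validation flagging.
# All data are simulated and all variable names are hypothetical.
set.seed(123)
n_items <- 500
items <- data.frame(
  p_value        = runif(n_items, 0.20, 0.95),   # proportion answering correctly
  point_biserial = runif(n_items, -0.10, 0.50),  # item-total correlation
  top_distractor = runif(n_items, 0.00, 0.45)    # proportion choosing the most popular distractor
)
# Hypothetical "professional judgment": items with low discrimination and a
# popular distractor are more likely to be sent for key validation.
items$flagged <- rbinom(n_items, 1,
                        plogis(-4 + 8 * (items$point_biserial < 0.10) +
                                 5 * items$top_distractor))
# Fit the model and flag items whose predicted probability exceeds a threshold.
fit <- glm(flagged ~ p_value + point_biserial + top_distractor,
           data = items, family = binomial)
items$auto_flag <- predict(fit, type = "response") > 0.50
table(judged = items$flagged, auto_flagged = items$auto_flag)

In practice, the flagging threshold would be tuned to balance unnecessary reviews against missed miskeyed items.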
National Board of Medical Examiners
Educational Testing Service I (ETS)
Educational Testing Service II (ETS)
Educational Testing Service III (ETS)
Human Resources Research Organization
The National Commission on Certification of Physician Assistants (NCCPA)
Position Summary: The National Commission on Certification of Physician Assistants (NCCPA) is offering an eight-week internship for students currently working toward their Ph.D. in psychometrics (or other relevant fields), with at least two years of graduate coursework. During the program, the intern will have the opportunity to gain experience in operational psychometric tasks involved in administering and scoring a certification assessment and to take the lead on a research project, in collaboration with psychometric staff. The research effort will include submission of a proposal to NCME, AERA, or similar conference, and will culminate in a research paper that can be delivered at that conference.
Candidates will undertake a project that meets their interests and skills and supports NCCPA’s exam-related research agenda. Current topics of interest to NCCPA revolve largely around our recently concluded, two-year, longitudinal assessment pilot. Specific areas of interest include automated item generation (AIG), multi-stage testing (MST), enemy item identification using machine learning methods, differential item functioning (DIF), small-sample equating, and standard setting.
This program is scheduled to begin in June and to conclude by the end of August. While interns will work remotely, during week one of the internship, they will work closely with the Psychometric team to develop a research plan for the remaining weeks. This initial project development week may be at the NCCPA offices in Johns Creek, GA, conditions permitting. In the event of an on-site experience, travel and lodging expenses will be covered by NCCPA. Over the following weeks, interns will convene in virtual weekly meetings with a mentor to discuss progress and to address questions. All finalized deliverables will be provided to NCCPA at the completion of the internship.
Application Procedure: A complete application includes a curriculum vitae, student copy of graduate school transcript (does not need to be an official transcript), two letters of recommendation, and a statement of purpose describing your interest in the internship as well as your general research interests. Application materials may be emailed or mailed to NCCPA and must be received by February 16, 2021. The internship award will be announced by March 2, 2021. The award includes a $6,000 stipend. Conference travel will be offset, and travel/lodging for trips to the NCCPA offices will be reimbursed in accordance with NCCPA’s policies.
Materials and/or questions should be submitted to:
J. B. Weir, Ph.D., Psychometrician
National Commission on Certification of Physician Assistants
12000 Findley Road, Suite 100
Johns Creek, GA 30097
National Center for the Improvement of Educational Assessment
2021 Summer Internship Program in Educational Assessment and Accountability
The National Center for the Improvement of Educational Assessment, Inc. (the Center) is a small nonprofit organization that occupies a unique and influential niche at the intersection of educational measurement and educational assessment policy. The Center is pleased to offer up to four (4) summer internships for advanced doctoral students in educational measurement and/or assessment/accountability policy who want the opportunity to work with the Center’s professionals on projects with direct implications for state and national educational policy.
Note: Given the current Covid-19 context, we plan to offer these internships remotely unless conditions change dramatically between now and June.
The Summer Internship Program
Each intern will work on one major project throughout the summer (to be negotiated between the intern and the Center mentor) and may participate with Center staff on other ongoing projects. The intern will have the opportunity to attend meetings and interact with state assessment personnel. Interns will be expected to produce a written report and a proposal for a research conference (e.g., NCME, AERA), as evidence of successful completion of their project. One of the Center’s senior staff will serve as the intern’s primary mentor, but the interns will interact regularly with many of the Center’s staff. Potential intern projects for 2021 may include the following:
1. Improving the interpretability of test score reports. The aesthetics and quality of information presented on test score reports have improved over the last decade, but a survey of state individual student reports conducted by a 2019 Center summer intern (Tanaka, 2019) revealed that error associated with test scores was rarely reported. The Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014) explicitly call for the error or uncertainty of test scores to be included whenever scores are reported. When asked why error is not being reported, many test contractors and state assessment leaders reported that users did not understand how to interpret error and were frustrated trying to make sense of these reports. This internship combines assessment literacy and report design to better understand how we might produce more accurate and useful score reports. This project will involve reviewing assessment literacy research on how best to communicate measurement error, designing report mock-ups, and conducting cognitive laboratories with potential stakeholders to evaluate and refine draft designs.
2. Creating a framework for Opportunity-to-Learn (OTL). Opportunity-to-learn is a concept more than 50 years old that has evolved from a focus on whether students have had sufficient access to instruction or content linked to particular concepts, to a more robust conception regarding the conditions and resources provided to schools to enable students to succeed. Marion argued for collecting OTL data in 2020-2021 to help contextualize the interpretation of 2020-2021 test scores and because summative assessments, even in the best of conditions, do not provide the information necessary for policymakers to understand students’ learning context. The intern selected for this project would build on the general framework outlined by Marion to create a detailed set of guidelines for states and school districts describing the types of indicators that should be collected and at what level of the system (e.g., student, district), the types of analyses that should be conducted, including using the OTL data to aid in interpreting student test scores, and how these OTL data should be reported. The intern and mentor will partner with at least one state to analyze actual OTL data and use these analyses to help refine the guidance.
3. Investigating the impact of performance assessments on instruction. Performance assessments are considered an authentic measure of student achievement. They are also considered by some educational reformers as an intervention to promote changes to instructional practices in schools, specifically to increase the level and complexity of the content that students are asked to learn. We know less about how teachers respond to large-scale performance assessment programs than we do about how teachers respond to high-stakes standardized tests. The goal of this project is to better understand how teachers perceive that implementing summative classroom performance assessments influences how and what they teach and what students are asked to learn. Project activities will likely include: (1) a review of the literature to contextualize what is known about whether and how teachers change their instructional practices due to the use of summative classroom performance assessments; (2) examination of instruments and techniques used to measure teachers’ instructional changes; and (3) recommendations for approaches states and districts could take to measure the impact of performance assessments on instruction.
4. Modeling student performance on a game-based assessment. This internship focuses on the examination of a game-based assessment that is administered over the course of a school year in early literacy and mathematics. This assessment is made up of multiple games aligned to key skills, which are administered over time based on educator needs. This design poses challenges to the construction of an interpretive and supporting validity argument, as well as to understanding and modeling student performance over time. Each game is designed to measure a separate early literacy or mathematics skill, and the intended interpretations about students are at the skill level. Therefore, understanding student performance over time is a matter of modeling within-game performance across time for a number of games. The internship project, then, is to first describe performance across time and then model performance across time (see the illustrative sketch following this list). Currently, longitudinal item response theory models appear to hold promise for these data, but the interested intern may suggest alternatives based on the initial description of the data. The ultimate aim of the work is to provide a comprehensive description of student skill performance to inform the ongoing development and use of the games.
5. Emphasizing design in the development of assessment and accountability systems. As states consider a redesign of their current assessment and accountability system(s), they should look to design approaches used in other industries (technology, pharmaceuticals, manufacturing) to understand and apply the process of innovation. To date, we have yet to find any studies that synthesize across these various approaches to determine their common and distinct elements of innovation and whether or how they could be applied to the field of education. We contend that a cross-sector analysis of design processes could lead to a more holistic understanding of how innovation happens in developing an assessment and accountability system and could benefit the education research and practice community. To address this gap in the research literature, the goal of this internship is to conduct a cross-sector examination of design processes to address three overarching questions: (1) how do different industries design to innovate, (2) what design approaches do they use to implement new products and innovate existing systems, and (3) what can be learned from these cross-sector design approaches to inform assessment and accountability system innovation in K-12 education?
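As one possible starting point for project 4 above, a longitudinal Rasch-type model for repeated within-game responses can be fit as a generalized linear mixed model. The R sketch below uses lme4 and simulated data; it is only one of several longitudinal IRT-style options an intern might consider, and all object and column names are invented for illustration.

# Illustrative R sketch: a longitudinal Rasch-type model fit as a GLMM.
# Simulated data; all object and column names are hypothetical.
library(lme4)
set.seed(42)
n_students <- 200; n_items <- 10; n_waves <- 3
d <- expand.grid(student = 1:n_students, item = 1:n_items, wave = 0:(n_waves - 1))
theta0   <- rnorm(n_students, 0, 1)     # ability at the first administration
growth   <- rnorm(n_students, 0.4, 0.2) # per-wave growth in ability
easiness <- rnorm(n_items, 0, 1)        # item easiness (Rasch-type parameters)
eta <- theta0[d$student] + growth[d$student] * d$wave + easiness[d$item]
d$correct <- rbinom(nrow(d), 1, plogis(eta))
d$student <- factor(d$student); d$item <- factor(d$item)
# Fixed item effects act as item parameters; random intercepts and slopes
# capture each student's starting skill and growth across waves.
fit <- glmer(correct ~ 0 + item + wave + (1 + wave | student),
             data = d, family = binomial)
summary(fit)

The same structure could be fit separately per game or skill; more flexible longitudinal IRT models (e.g., with discrimination parameters) would require dedicated IRT software.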
The intern must have completed at least two years of doctoral coursework in educational measurement, curriculum studies, statistics, research methods, or a related field. Interns with documented previous research experience are preferred. Further, interns must document their ability to work independently to complete a long-term project. We have found that successful interns possess most of the following skills and knowledge (the importance of the level of skills and knowledge in each of the areas described below depends on the specific project):
- Ability to work on a team under a rapid development model
- A deep understanding of educational assessment and its uses including policy and practice
- Content knowledge in a relevant discipline (e.g. science, mathematics, language arts)
- Depending on the project, working knowledge of statistical analysis through multivariate analyses as well as fluency with one or more statistical packages, e.g., SAS, SPSS, R
- A solid understanding of research design
- Psychometrics (both classical and IRT) with demonstrated understanding of the principles of reliability and validity
- An interest in applying technical skills and understanding major policy and practical issues
- Excellent written and competent spoken English skills
The internship duration is 8 weeks and will be conducted remotely this year unless current health conditions change considerably prior to June. The internship will start in early June 2021; the specific date will be determined by the intern and the mentor.
The Center will provide a stipend of $6,000 as well as a housing allowance and reasonable relocation expenses (should the internship be in person).
To apply for the internship program, candidates should submit the following materials electronically:
- A letter of interest explaining why the candidate would be a good fit with the Center, what the candidate hopes to gain from the experience, and which project(s) the candidate prefers. Further, the letter should explain both what the candidate could contribute to the preferred project(s) and why the project(s) fit with the candidate’s interests.
- Curriculum vitae, and
- Two letters of recommendation (one must be from the candidate’s academic advisor).
Materials must be submitted electronically (including letters of recommendation) to:
Sandi Chaplin at firstname.lastname@example.org and received by February 14, 2021.
National Association of Boards of Pharmacy
The summer psychometrician internship is an eight-week, remote program. During the program, the intern will have an opportunity to gain experience in operational psychometric tasks involved in administering and scoring exams and will take primary ownership of an applied research project related to psychometrics under the guidance of a psychometrician mentor. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics.
Interns will help define a research question, review related studies, conduct data analyses (real and/or simulated data), and write a summary report suitable for presentation. Current research projects are summarized below. Applicants are encouraged to identify specific projects they prefer to work on and, if desired, to indicate any other research they would like to conduct.
- Building an anomaly detection system for identifying aberrant testing behavior
- Investigating novel optimization techniques for automated test assembly
- Creating a system to identify enemy items using machine learning
- Formative assessments using Bayesian Networks or Cognitive Diagnostic Models
- Novel standard setting methods
Qualifications:
- Must be a doctoral student in educational measurement or a related field who has completed at least two years of coursework.
- Experience with item response theory preferred.
- Excellent communication skills.
- Demonstrated interest in licensure and certification testing.
- Proficiency in R and Winsteps preferred.
- Must be authorized to work in the United States for any employer.
To apply for this position, please email the following information to email@example.com.
- Your curriculum vitae with a list of completed graduate courses
- A statement of purpose describing your interests, including preferred projects and general research interests
Application materials must be received by February 19, 2021, and the internship will be announced in mid-March 2021.
No phone calls, please.
Digital Harbor Foundation
Virtual Fellowship in Computational Psychometrics
The Duolingo English Test and Digital Harbor Foundation are launching a new fellowship program to help train the next generation of assessment researchers. The program will support three (3) fellows during the summer of 2021 (June 16 - August 4). Applications are open to doctoral candidates authorized to work in the US, and intentional efforts will be made to attract diverse candidates from minority-serving institutions. Each fellow will receive a $7,000 stipend from the Digital Harbor Foundation.
QUALIFICATIONS
● Completion of two or more years of coursework
● Experience in one or more of the following: test development, IRT, CTT, statistics, research design, adaptive testing, or programming in R or Python
● Working knowledge of statistical software (e.g., R or Python)
● Preferred candidates will also have advanced knowledge of topics such as CAT, natural language processing, diagnostic modeling, or generalizability theory
● Active enrollment in a doctoral program in measurement, psychometrics, statistics, cognitive science, machine learning, or a related field
● Authorization to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a social security number for payroll purposes
MATERIALS TO SUBMIT
● A current CV
● A cover letter
● The title of the dissertation and an abstract (no more than 200 words, single-spaced, including references)
● An official letter of recommendation from the applicant's research supervisor
Applications should be submitted by February 15, 2021 at 11:59 pm EDT. Candidates will be reviewed jointly by the Duolingo English Test and Digital Harbor Foundation teams. Award notifications will be sent out in March.
Fellowship dates: June 16, 2021 - August 4, 2021
HOW TO APPLY
● Applicants: Email your application as an attachment to firstname.lastname@example.org with the subject line “[your last name] Computational Psychometrics Fellowship”
● Supervisors: Email your letter of recommendation as an attachment to email@example.com with the subject line “[applicant’s last name] Computational Psychometrics Fellowship.” The letter of recommendation must come directly from the supervisor.
Previous award winners of the Duolingo doctoral awards will not be considered.
FOR ANY INQUIRIES
For any inquiries regarding the application process or the grant, please contact us at firstname.lastname@example.org. Responses to general questions will be posted in FAQs on the Duolingo English Test FAQ page.
The Curriculum Associates Psychometric and Research Summer 2021 Internship
Curriculum Associates is seeking doctoral students for a summer internship focused on psychometric and research projects related to the award-winning i-Ready® Diagnostic, as well as the nationally recognized i-Ready Personalized Instruction system.
About Curriculum Associates
Curriculum Associates is a rapidly growing educational technology and publishing company committed to making classrooms better places for teachers and students. Our assessments provide valuable feedback to teachers and students and are primarily used to place students into individualized instructional paths.
About the Internship
The Curriculum Associates Psychometric and Research Summer Interns will work on both operational psychometric and research projects and will also have the opportunity to get exposure to many aspects of the assessment design and development process including observing a standard setting and a Technical Advisory Committee meeting. Specifically, interns will:
- Analyze the large datasets at Curriculum Associates, which include some of the most extensive assessment and instructional data in the nation.
- Collaborate with other members of the assessment development, psychometrics, and research teams.
- Learn about how an operational computerized adaptive assessment is managed.
Interns will focus on at least one major project with the goal of presenting findings at a national conference. Some general topics include the following:
- Psychometric modeling of the i-Ready Instruction lesson quizzes.
- Develop instructional profiles based on longitudinal usage metrics.
- Examine the relationship between diagnostic duration, number of testing sessions, and student engagement.
- Research related to the COVID-19 pandemic and/or inequities in our educational systems.
- Assist in the development and refinement of a validity argument for i-Ready Instruction.
- Investigate IRT calibration requirements in a vertically scaled CAT.
- Investigate and define requirements for a representative equating sample to implement through a continuous field test design.
- Develop data visualization options for IRT and classical item statistics.
- Explore DIF methodologies for large scale assessments in a CAT environment.
Note that the lists above are not exhaustive. Interns may find interest in a project not listed here or may develop a research question of their own.
Qualifications for Interns
Candidates for the internship will ideally have the following qualifications:
- Currently pursuing a doctorate in educational measurement or a related field. Preference will be given to students with at least two years of doctoral-level coursework completed by the start of the internship.
- Must have intermediate proficiency in data manipulation and analysis in SAS and/or R.
- Must have a strong interest in applied psychometrics and research that impacts student learning.
Funding, Length, and Location of Internship
- Interns will work remotely for eight weeks during the summer, ideally from June 14-August 6. Flexibility can be offered in the internship dates if needed.
- Interns will be considered temporary employees of Curriculum Associates.
- Interns work 40 hours/week and will be paid $25 per hour (equivalent to $8,000 for 8 weeks).
Questions and Next Steps
Please provide the following information in your application:
- A cover letter indicating your preference between the psychometric and research tracks and outlining your current research interests, the reason for your interest in the internship, and the name of your advisor.
- A current resume / CV.
- A letter from your advisor that indicates your advisor is supportive of your involvement in the internship at this point in your graduate work.
Applications will be considered on a rolling basis, with a final deadline of March 1, 2021 at 9:00am ET. Potential interns will be interviewed over video and final selections will be made in mid-March.
Apply here: https://jobs.jobvite.com/careers/curriculumassociates/job/ojovefwa?__jvst=Employee%20Referral&__jvsd=seH0Lhws&__jvsc=email&bid=nkjuFSw9
Edmentum Research and Learning Engineering (RLE) Psychometrics Internship
Edmentum is committed to making it easier for educators to individualize learning for every student through simple technology, high-quality content, actionable data, and customer success. Founded in innovation, Edmentum’s powerful learning solutions blend technology with individual teaching approaches. We are dedicated to being educators' most trusted partner in creating successful student outcomes everywhere learning occurs. Our commitment is built off the emphasis we place on our core values: passion, people, respect, collaboration, and performance.
At Edmentum, we are committed to building research-based curriculum and assessment solutions that empower educators and improve student outcomes across the globe. The Edmentum Research and Learning Engineering Summer Internship - Psychometrics is a paid internship opportunity for curious and collaborative graduate students who are interested in pursuing a career as a psychometrician and are deeply interested in exploring educational solutions that truly meet the needs of learners and educators.
For the summer 2021 internship, the psychometrics intern will collaborate with Edmentum Research Scientists and Learning Engineers to conduct operational psychometric analyses to examine the reliability and validity of Edmentum’s Exact Path computer adaptive diagnostic assessment for students in K-12. The intern will gain experience in applied psychometric analyses, computer adaptive testing, educational technology, processes and tools used in the psychometric and educational technology industries, and working with large data files (>10 million records).
With support from Edmentum Research Scientists and Learning Engineers, the psychometrics intern will:
- Modify existing analysis code and write new code to prepare data and perform psychometric analyses
- Interpret analysis findings and provide written descriptions to update the Exact Path technical report
- Explore and recommend tools for modernizing the format of the Exact Path technical report (e.g., by shifting away from Word documents and PDFs)
- Create internal dashboards with real-time monitoring of the psychometric quality of the Exact Path assessment
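As a minimal, hypothetical illustration of the dashboard item above, the R/Shiny sketch below plots simulated item statistics and lists items that fall below a chosen item-total correlation threshold. Shiny is assumed here purely for illustration (not as Edmentum's tooling), the data are simulated, and every name is invented; a production dashboard would read live Exact Path results rather than a random table.

# Illustrative R/Shiny sketch: a simple item-quality monitoring dashboard.
# Item statistics are simulated; a real dashboard would query current data.
library(shiny)
library(ggplot2)
set.seed(7)
item_stats <- data.frame(
  item_id      = sprintf("item_%03d", 1:200),
  p_value      = runif(200, 0.10, 0.95),   # proportion correct
  item_total_r = runif(200, -0.05, 0.55)   # item-total correlation
)
ui <- fluidPage(
  titlePanel("Item quality monitor (illustrative)"),
  sliderInput("min_r", "Flag items with item-total correlation below:",
              min = 0, max = 0.30, value = 0.15, step = 0.01),
  plotOutput("scatter"),
  tableOutput("flagged")
)
server <- function(input, output, session) {
  output$scatter <- renderPlot({
    ggplot(item_stats, aes(p_value, item_total_r)) +
      geom_point() +
      geom_hline(yintercept = input$min_r, linetype = "dashed") +
      labs(x = "Proportion correct", y = "Item-total correlation")
  })
  output$flagged <- renderTable({
    subset(item_stats, item_total_r < input$min_r)
  })
}
shinyApp(ui, server)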
Edmentum is also interested in supporting the intern’s unique research interests for projects that can be completed with existing or simulated data and have promise to lead to conference presentations and/or publications while simultaneously improving Edmentum products. Although not required, interns with particular research questions of interest related to K-12 computer-adaptive testing are encouraged to propose research ideas in their cover letter.
Qualifications:
- Enrolled in a graduate program in educational measurement, psychometrics, statistics, learning sciences, or a related field
- Have completed at least one year of graduate-level statistics coursework
- Have completed at least one introduction to measurement / psychometrics course
- Proficient with basic R programming skills (data preparation, descriptive statistics, data visualizations) with desire to learn more advanced skills
- Excellent writing skills
- Self-starter who can work independently (with appropriate support provided by Edmentum Research Scientists)
This position is remote and will last 8 to 12 weeks depending on the intern’s availability. Due to frequent collaboration with Research and Learning Engineering team members, working hours will follow typical central time business hours with some flexibility as needed.
Edmentum is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, disability status, protected veteran status, or any other characteristic protected by law.
To apply: https://recruiting.ultipro.com/PLA1009PLATO/JobBoard/04e3a767-9600-42c4-86c9-5912bbf18813/OpportunityDetail?opportunityId=e8fdec82-72ad-4057-9ede-05100c5c9d6f
National Board for Professional Teaching Standards
2021 Summer Assessment Research Internship
The National Board for Professional Teaching Standards’ mission is to advance the quality of teaching and learning through a voluntary advanced certification.
The National Board for Professional Teaching Standards is seeking a well-qualified individual for a remote internship opportunity to gain experience in assessment research. The intern will work remotely with the National Board’s psychometrician to determine which research projects to pursue. The Summer Assessment Research Internship will start on or around June 14, 2021; the specific start date will be determined by the intern and the psychometrician, and the internship will last no more than eight weeks on a part-time basis.
The intern will assist with research in support of the National Board’s test development processes and psychometric methodologies.
Summary of Key Responsibilities:
Three research projects are presented below. All research projects involve manipulating large amounts of data and require the ability to carefully verify assumptions and the accuracy of the data. The research project that the intern will work on will be determined primarily by the interest and experience of the intern. The intern and the National Board’s psychometrician will work together to define the research question; review related research; become familiar with the data; plan and conduct data analyses; and write a report. Please review the research projects below.
Analysis and Reporting: Assessment Performance by State
States and districts vary in the supports and incentives they offer for candidates pursuing National Board Certification. The supports and incentives depend upon the funding the state or district receives as well as other policies at that level, such as classroom release time. What is the relationship of state and district policies, including supports and incentives, to certification outcomes such as the percentage of candidates who succeed in earning certification?
The intern can expect to analyze the relationship of supports, incentives and challenges to performance on the certification assessment at the state and/or district level and produce a report on his/her findings and suggestions for further research.
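As a purely illustrative sketch of this kind of analysis, the R code below relates hypothetical state-level supports and incentives to certification rates using a binomial regression. The data are simulated and every variable name is an assumption, not a description of National Board data.

# Illustrative R sketch: relating state-level supports and incentives to
# certification rates. Data are simulated; all names are hypothetical.
set.seed(2021)
n_states <- 50
states <- data.frame(
  fee_support   = rbinom(n_states, 1, 0.5),   # state/district pays assessment fees
  salary_incent = rbinom(n_states, 1, 0.4),   # salary incentive for certification
  release_time  = rbinom(n_states, 1, 0.3),   # classroom release time provided
  n_candidates  = sample(50:500, n_states, replace = TRUE)
)
# Hypothetical data-generating model in which supports raise certification rates.
true_rate <- plogis(-0.5 + 0.4 * states$fee_support +
                      0.3 * states$salary_incent + 0.3 * states$release_time)
states$n_certified <- rbinom(n_states, states$n_candidates, true_rate)
# Binomial regression of certification outcomes on state-level policies.
fit <- glm(cbind(n_certified, n_candidates - n_certified) ~
             fee_support + salary_incent + release_time,
           data = states, family = binomial)
summary(fit)

With real data, the analysis would also need to account for candidate-level characteristics and for states that offer several supports in combination.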
Survey Research: Attrition
There are several steps an educator must take to complete the certification process: creating an account, registering, purchasing assessment components, submitting the assessment components within a required timeline, and meeting the score requirements for certification. Who are the educators who drop out of the certification process at the various steps along the way and why do they drop out? Having this information will help the National Board evaluate challenges related to certification completion as well as threats to the program’s accessibility.
The intern can expect to analyze the demographics of those who discontinued the process at various stages, develop and conduct a survey that aims to find out why those educators did not take the step in the certification process, and produce a report on his/her findings and suggestions to reduce attrition.
Analysis and Reporting: Performance Feedback Statements
It’s well known that National Board candidates crave feedback on how they performed on an assessment, especially if they need to retake part of the assessment in order to meet the score requirements for certification. Feedback statements appear on a candidate’s score report to guide him/her in the general areas in which he/she might want to target improvement upon retake. A separate set of feedback statements has been developed for each of the three portfolio components at each of the three score levels.
The intern can expect to analyze the frequencies and patterns of feedback statements by portfolio component and produce a report on his/her findings and suggestions for application and further research.
Summary of Experience and Required Competencies (Knowledge, Skills and Abilities):
· Graduate student at an accredited university, preferably in educational measurement, quantitative psychology, curriculum studies, statistics, research methods, or a related field, with an emphasis on psychometrics and statistical data analysis; completion of two or more years of graduate coursework.
· Intern must be able to work independently and exercise sound judgment, while meeting remotely 1-2 times a week with the National Board’s psychometrician.
· Intern must have access to statistical software and be competent in programming within that software.
· Ability to manage and query large data sets.
· Strong research, analytical, writing, and communication skills.
· Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school's international student office, and have a social security number for payroll purposes.
Position Type and expected hours of work:
· This is a part-time, temporary position for a time-limited assignment beginning on or around June 14, 2021 and lasting up to eight weeks.
· The National Board's office is located in Arlington, VA; however, National Board staff are currently working remotely due to COVID-19 and the intern will be expected to work remotely as well.
· Total compensation for the two months is approximately $2,500.
To apply, please submit the following:
1. A cover letter including your research area(s) of interest and your experience in those areas, what you hope to achieve from the internship, what you can contribute to the organization, and evidence of your ability to work independently on a research project,
2. Your current resume or curriculum vitae, and
3. Two letters of recommendation (one must be from your academic advisor).
The main criterion for selection will be the match of applicant interests and experience with the research projects. The application deadline is April 9, 2021. Late or incomplete applications will not be considered. All applicants will be notified of selection decisions by May 15, 2021.
Apply at https://nbpts.applytojob.com/apply/WAIQmeg0f6/Assessment-Research-Summer-Intern?source=NCME
The National Board provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Graduate Management Admission Council
2021 INTERNSHIP PROGRAM PSYCHOMETRIC RESEARCH
GMAC is pleased to offer a paid eight-week internship this year to one advanced graduate student motivated to conduct innovative research with the Test Development and Psychometrics Department. Possible research topics include (but are not limited to) response time modeling, speeded exams, guessing behavior, penalty scoring, automated item generation, differential item functioning, and non-cognitive assessment. Please visit gmac.com for general information about the organization.
- Openings: 1
- Dates: Any period of 8 consecutive weeks between July 5th and October 29th (preferred: September 6th to October 29th)
- Location: On-site at GMAC headquarters in Reston, VA or remote if necessary (preferred: on-site pending COVID-19 safety)
- Stipend: $8,000 all-inclusive (intern is responsible for relocation, housing, transportation, and living expenses)
- Deliverable: In collaboration with an assigned mentor, intern is expected to produce original research submitted as a GMAC white paper and/or a conference proposal for NCME, AERA, IMPS, etc.
- Current enrollment and good standing in a doctoral program in Educational Psychology, Quantitative Psychology, Statistics, or a related field
- A minimum of three years of coursework in statistics and psychometrics, particularly covering the fundamentals of measurement theory (preferred: dissertation stage)
- Demonstrated potential for quality research (e.g., completed or on-going projects, master’s thesis, preliminary exam, conference presentations, publications)
- Experience in statistical programming (using R, SAS, Python, etc.) and familiarity with adaptive testing
- Materials: (1) CV/resume; (2) brief cover letter highlighting relevant experience, research interests, and flexibility in the date and location of the internship; (3) unofficial transcript; (4) one letter of recommendation. Submit all materials as email attachments to email@example.com
- Deadline: April 30, 2021; applicants will be notified of decisions by May 15th