Internships & Fellowships



         
The following information is maintained by the Graduate Student Issues Committee (GSIC).

If you would like to have your internship or fellowship listed here, please contact our GSIC co-chair:
Maura O'Riordan: moriordan@umass.edu
   


Graduate Student Internships

Internships are a valuable way to link your academic experience with the professional arena. Below is a list of internships that will allow students to go beyond the classroom and conduct practical research with a mentor from a testing company or research agency.


Graduate Student Fellowships

Fellowships provide structured work experience and professional development, including intensive training and experiential learning. Below is a list of fellowships that support fellows' growth and offer opportunities to explore a particular field of measurement.

American Board of Internal Medicine 

American Board of Internal Medicine Internship

Summer 2021 Psychometric Internship Announcement 

The ABIM is pleased to announce the return of its summer internship in psychometrics. 

Internship Opportunity 

The ABIM’s psychometric internship program is an eight-week summer internship running from Monday, June 14 to Friday, August 6, 2021, in Philadelphia, PA*. During the program, the intern will take primary ownership of an applied psychometric research project under the guidance of one of the ABIM’s measurement scientists. The intern will also have opportunities to assist psychometric staff on other research projects and to learn about operational processes (e.g., item analysis, IRT calibration, equating) within the context of medical certification. 

* If necessary for health and safety reasons, the ABIM summer internship will be conducted remotely. ABIM will communicate with candidates about such decisions in the spring. 

† If the internship is conducted remotely, interns will not receive the housing allowance. 

Qualifications 

  • Doctoral student in an educational measurement (or related field) program with at least two years of coursework completed by the start of the internship 
  • Preference will be given to applicants who have experience with item response theory
  • Excellent communication skills
  • Interest in certification testing
  • Eligible to be legally employed in the United States 

Stipend 

The ABIM provides a total of $10,000 for the eight-week internship program. This total includes an $8,000 stipend as well as a $2,000 housing allowance.† 

Research Projects 

For their primary research project, the intern should expect to perform all stages of the research process, from literature review to discussion and dissemination of results. At the conclusion of the program, the intern will be expected to share their results by giving a brief presentation to an audience of psychometric staff. Further, the intern will be encouraged to submit their summer project(s) for presentation at a professional conference and/or for publication. The intern will work with their mentor to select an appropriate project for their experience level and interests. Examples of previously completed internship projects are listed below. 

Application 

Please submit your curriculum vitae and a letter of interest to Michele Johnson, Research Program Manager (researchintern@abim.org), by Monday, February 1, 2021. 

Examples of Previously Completed Internship Projects 

Please note these are examples of previous ABIM Interns’ projects. The summer 2021 internship project will be determined jointly by the 2021 mentor and summer intern. 

  • Anchor Item Replacement in the Presence of Consequential Item Parameter Drift. The purpose of this project was to develop evidence-based recommendations for replacing anchor items when a significant number are flagged for exhibiting item parameter drift. The intern conducted a simulation study that investigated different item replacement strategies and examined the effects of each strategy with respect to outcomes such as pass/fail classification accuracy and the RMSD of theta estimates. 

Chang, K., & Rewley, K. Currently under review. 

  • Evaluating Use of an Online Open-Book Resource in a High-Stakes Credentialing Exam. This project examined item and examinee characteristics associated with the use of an open-book resource throughout a high-stakes medical certification exam. Using exam process data and a generalized estimating equations modeling framework, the intern examined use of the open-book resource and how it might affect examinees’ test-taking experience and performance. 

Myers, A., & Bashkov, B. Paper presented at NCME 2020. 

  • Investigating the Impact of Parameter Instability on IRT Proficiency Estimation. This project examined how poorly estimated item parameters impact different proficiency estimators. The intern conducted a simulation study to examine how different levels of parameter instability impact Bayesian vs. non-Bayesian estimators as well as pattern vs. summed-score estimators. 

McGrath, K., & Smiley, W. Paper presented at NCME 2019. 
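A study of this kind can be sketched in a few lines of Python. The 2PL model, the magnitude of the calibration error, and the grid-based estimators below are illustrative assumptions for a single examinee, not the actual design of the intern's study:

```python
import numpy as np

rng = np.random.default_rng(7)

def p2pl(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# True item parameters for a short 2PL test (assumed values)
n_items = 30
a_true = rng.uniform(0.8, 2.0, n_items)
b_true = rng.normal(0.0, 1.0, n_items)

# "Unstable" parameter estimates: true values plus calibration error
a_est = a_true + rng.normal(0, 0.15, n_items)
b_est = b_true + rng.normal(0, 0.30, n_items)

# Simulate one examinee's responses at a known theta
theta_true = 0.5
resp = (rng.random(n_items) < p2pl(theta_true, a_true, b_true)).astype(int)

grid = np.linspace(-4, 4, 401)

# Log-likelihood of the response pattern at each grid point,
# evaluated with the *estimated* (unstable) item parameters
p = p2pl(grid[:, None], a_est, b_est)
ll = (resp * np.log(p) + (1 - resp) * np.log(1 - p)).sum(axis=1)

theta_ml = grid[np.argmax(ll)]            # non-Bayesian pattern ML estimate
post = np.exp(ll) * np.exp(-grid**2 / 2)  # posterior under a N(0, 1) prior
theta_eap = (grid * post).sum() / post.sum()  # Bayesian EAP estimate

print(f"ML: {theta_ml:.2f}   EAP: {theta_eap:.2f}")
```

In a full simulation, this comparison would be repeated over many examinees and error levels to see how strongly each estimator is pulled away from the true theta by unstable item parameters.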

  • Using Data Visualization to Explore Test Speededness in Certification Exams. This project examined different ways to determine whether a test is speeded. The intern conducted a thorough literature review of methods used to detect and quantify test speededness. She then used data visualization, a nonparametric approach that makes no distributional assumptions, to examine operational test data for speededness. This was shown to be a viable approach to assessing the impact of examination timing. 

Sullivan, M., & Bashkov, B. Paper presented at TIME 2017. 

  • Automatic Flagging of Items for Key Validation. This project developed an automatic method for determining whether there is a problem with an item’s key. The intern collected data from psychometricians regarding which items required key validation and used logistic regression to mimic that professional judgment and automatically flag problematic items. 

Sahin, F., & Clauser, J. Paper presented at NCME 2016. 
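The flagging approach described above can be sketched as follows. The item statistics, flagging rule, and thresholds here are invented for illustration; they are not ABIM's actual data or model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical classical item statistics for 500 items:
# p-value (proportion correct) and corrected point-biserial correlation
n = 500
pvalue = rng.uniform(0.2, 0.95, n)
ptbis = rng.normal(0.25, 0.12, n)

# Hypothetical labels: suppose psychometricians flagged items with very
# low p-values or negative discrimination for key validation
flagged = ((pvalue < 0.35) | (ptbis < 0.0)).astype(float)

# Fit a logistic regression (intercept + 2 slopes) by gradient descent
X = np.column_stack([np.ones(n), pvalue, ptbis])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - flagged) / n  # gradient of mean log-loss

def flag_probability(pval, pb):
    """Predicted probability that an item needs key validation."""
    return 1.0 / (1.0 + np.exp(-(w[0] + w[1] * pval + w[2] * pb)))

# An easy, discriminating item should get a low flag probability
print(round(float(flag_probability(0.80, 0.30)), 3))
```

In practice the feature set would be richer (distractor statistics, response-time signals, and so on), but the core idea is the same: learn a decision rule from past human key-validation judgments and apply it to new items.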

National Board of Medical Examiners 

National Board of Medical Examiners Internship

Summer 2021 Internships in Assessment Science and Psychometrics

June 7 - July 30, 2021   Philadelphia, PA

This year’s internship will be virtual due to uncertainty surrounding the COVID-19 pandemic. 

Interns will interact with other graduate students and NBME staff and will present completed projects or work-in-progress to NBME staff. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics. 

Requirements 

  • Active enrollment in a doctoral program in measurement, statistics, cognitive science, medical education, or a related field; completion of two or more years of graduate coursework.
  • Experience or coursework in one or more of the following: test development, IRT, CTT, statistics, research design, and cognitive science. Advanced knowledge of topics such as equating, generalizability theory, or Bayesian methodology is helpful. Skill in writing and presenting research. Working knowledge of statistical software (e.g., Winsteps, BILOG; SPSS, SAS, or R).
  • Interns will be assigned to one or more mentors but must be able to work independently.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a social security number for payroll purposes. 

Compensation 

Total compensation for the two months is approximately $8,000. 

Research Projects 

Interns will help define a research problem; review related studies; conduct data analyses (real and/or simulated data); and write a summary report suitable for presentation. Projects are summarized below. Applicants should identify by number the two projects they would prefer to work on. 

1. Exploring Response Process Validity Evidence for a Medical Licensing Examination Program. Passing scores on the United States Medical Licensing Examination (USMLE) sequence are intended to signal readiness for the unsupervised practice of medicine in the United States. This project will examine response process validity evidence for USMLE score interpretations. The USMLE comprises three computer-based multiple-choice question examinations: Step 1, Step 2 Clinical Knowledge, and Step 3. For each multiple-choice question on each Step examination, various testing features are available to examinees. These features include the ability to 1) highlight examination text, 2) cross out examination text, 3) take notes in a blank window, and 4) view lab values. This project will examine the associations between use of these features and performance on the examinations. Inferential statistical analyses will be conducted for all examinees as well as by examinee subgroup, defined by such characteristics as gender and location of medical school. Among other responsibilities, an intern for this project would help to design the study, manage and manipulate a large, complex data set, conduct statistical analyses, and interpret and present results. 

2. Virtual Performance Assessment. The United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills Examination (Step 2 CS), in existence since 2004, is undergoing a revitalization. Before the planned rollout of the restructured exam, the NBME will be conducting multiple research studies. Possible projects include work on generalizability, quality assurance metrics, or evaluation of scoring/equating designs for this virtual examination. 

3. What Determines Physician Competence? The USMLE assesses the competence of future physicians through a system of tests that checks the breadth and depth of medical knowledge as well as clinical and doctor-patient communication skills. In addition to USMLE scores, a plethora of exam scores are collected throughout medical school and residency. The goal of this project is to examine the relationships between the various intermediate medical education tests and ultimate physician competence indicators. 

4. Interactive Psychometric Dashboards. Contemporary methods of communicating score information are evolving beyond fixed displays on traditional 8 ½ by 11-inch page layouts. Additionally, score users demand more flexibility and specificity to draw tailored inferences in support of evidence-based decisions. For this project the intern will contribute to the development of dashboard products using R Shiny and participate in engagement efforts with internal and external customers. 

5. Using NLP to Explore Item Bias and Test Validity. The use of Natural Language Processing (NLP) within applications, in conjunction with psychometric theory and statistics, has shown great promise for providing innovative information that was not previously available in the item and test development processes of high-stakes licensure programs. This project will focus on the use of basic NLP techniques and application development intended to analyze text in large item banks, to improve item and test development and provide more evidence for test fairness and the validity of test scores. An intern with little or no NLP experience, but who is interested in learning more about this field, may be suitable. Interns with previous NLP experience will have the possibility of incorporating more advanced NLP techniques and analyses into their work, depending on the project scope and other projects underway at NBME. 

6. Measurement Instrument Revision and Development. This project will continue ongoing work on revising a commonly used measurement instrument so that appropriate inferences can be made about medical student well-being. Duties may include the following: working with subject-matter experts to revise the existing items; conducting think-alouds with medical students; developing a pilot measure of potential items; exploratory and confirmatory factor analysis of initial pilot results to gather structural validity evidence; developing a larger survey to gather concurrent and discriminant validity evidence with the revised measure; and administration and evaluation of the larger survey. 

7. Qualitative Analysis of Focus Group and Interview Data. The RENEW (Re-examining Exams: NBME Effort on Wellness) task force at NBME is focused on understanding the relationship between the pressures of high-stakes licensure examinations and medical student well-being. As part of this work, a series of focus groups and interviews have been completed, but this data has not been exhaustively utilized and analyzed. The intern assigned to this project will apply qualitative and mixed-method methodologies to learn more about the information gathered during the focus groups and interviews. 

8. Computer-Assisted Scoring of Constructed Response Test Items. Relying on subject matter experts to score constructed-response test items is expensive, time-consuming, and introduces natural scoring variations inherent when using human raters. Recently the NBME has developed a computer-assisted scoring program that utilizes natural language processing (NLP) to mitigate these issues. The two main components of the program are (1) ensuring that the information in the constructed response is correctly identified and represented; and (2) building a scoring model based on these concept representations. Current areas of research surrounding this project include (but are not limited to): refining quality control steps to be taken prior to an item being used in computer-assisted scoring; linking and equating computer-assisted scores with human rater scores; evaluating a scoring method based on using orthogonal arrays; and developing metrics that assess item quality and test reliability when computer-assisted scores and human scores are used to make classification decisions. The final project will be determined based on a combination of intern interest and project importance. 
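A toy sketch of the first component, identifying which expected concepts a constructed response represents, might look like the following. The concepts, synonym lists, and naive substring matching are invented for illustration and are far simpler than NBME's actual NLP-based scoring program:

```python
import re

# Hypothetical scoring key: each expected concept maps to a set of
# acceptable synonym phrases (invented examples, not NBME content)
KEY_CONCEPTS = {
    "hypertension": {"hypertension", "high blood pressure", "elevated bp"},
    "diabetes": {"diabetes", "diabetes mellitus", "dm"},
}

def score_response(text):
    """Count how many key concepts appear in a constructed response.

    Uses naive substring matching after lowercasing and stripping
    punctuation; a real system would use proper NLP (tokenization,
    normalization, and learned concept representations).
    """
    normalized = re.sub(r"[^a-z ]", " ", text.lower())
    hits = 0
    for synonyms in KEY_CONCEPTS.values():
        if any(s in normalized for s in synonyms):
            hits += 1
    return hits

print(score_response("Patient has high blood pressure and DM."))  # 2
```

The second component, building a scoring model on top of these concept representations, would then map the detected concepts to a score, which is where the linking, equating, and quality-control research questions listed above come in.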

9. Modern Test Construction Methods. The art and science of constructing an efficient and effective high-stakes exam requires balancing several competing aims. These theoretical and statistical considerations are explicitly enumerated when constructing an exam, resulting in a set of constraints that require complex algorithms to build secure test forms. Recent research has suggested that some of these constraints may be unnecessarily restrictive, resulting in test item banks that aren’t efficiently utilized and increased costs for test development and construction. An intern for this project would review current practices for exam construction and examine which constraints may be relaxed while still administering an exam with appropriate domain coverage and exam security considerations. 

Application 

Candidates may apply by going to https://nbme.applicantpro.com/jobs/. A cover letter outlining experience and listing project interests by number, along with a current resume, is required. The application deadline is January 31, 2021. 

All applicants will be notified of selection decisions by February 26, 2021.

Overview 

NBME offers a versatile selection of high-quality assessments and educational services for students, professionals, educators, and institutions dedicated to the evolving needs of medical education and health care. To serve these communities, we collaborate with a comprehensive array of professionals including test developers, academic researchers, scoring experts, practicing physicians, medical educators, state medical board members, and public representatives. 

Together with the Federation of State Medical Boards, NBME develops and manages the United States Medical Licensing Examination. In addition, we are committed to meeting the needs of educators and learners globally with assessment products and expert services such as Subject Examinations, Customized Assessment Services, Self-Assessments, the International Foundations of Medicine, and Item-Writing Workshops. 

We also provide medical education funding and mentorship through the Latin America Grants, Stemmler Fund, and Strategic Educators Enhancement Fund, which serve to advance assessment at educators' and health professionals' own institutions. 

NBME employs approximately 30 doctoral level psychometricians and assessment scientists, as well as several MDs specializing in medical education. Staff is recognized internationally for its expertise in statistical analysis, psychometrics, and test development. 

Learn more about NBME at NBME.org. 

Diversity, Equity, and Inclusion: 

At NBME, we continue to innovate and improve how we fulfill the evolving needs of the health care community. This commitment starts and ends with the people at NBME. By recruiting and empowering talented individuals from various disciplines and backgrounds, including professionals with diverse life experiences, abilities, and perspectives, NBME can take a well-informed, robust approach to advancing medical education and assessment for years to come. 

 

Educational Testing Service I (ETS)

2021 Summer Research and Measurement Sciences (RMS)
Internship Program for Graduate Students

Description

If you are a creative and innovative individual who wants to help shape the future of learning and assessment, we encourage you to apply for the 2021 Summer Research and Measurement Sciences (RMS) Internship program. Steeped in decades of broad expertise, RMS conducts rigorous foundational and applied research on the most critical issues facing education and the workforce. Central to ETS’s legacy of global leadership in learning and assessment, RMS is dedicated to advancing the science and practice of measurement, driving innovation in digital assessment, learning and teaching.

Applying for an Internship at RMS

As an intern in RMS, you’ll work with experts who are nationally and internationally known as thought leaders, trusted advisors and go-to collaborators for their high-impact work addressing significant educational and societal goals. ETS staff in RMS have expertise in psychology, education, psychometrics, measurement, statistics, cognitive or learning sciences and data science.

Interns who are accepted into the program will collaborate with scientists on projects in these areas and will participate in data analysis, writing and other research tasks. Doctoral students who have completed at least two years of study in one of these fields or a related field are encouraged to apply. Upon completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Note: Applicants may apply to the RMS or AI Labs Internship programs, but not both. However, all applicants may be considered for both programs, depending on qualifications and project needs.

Application Procedures

Complete the electronic application form. On the application form:

  • Choose up to two research areas in which you are interested and provide written statements about your interest in the area(s) of research and how your experience aligns with the project.
  • Attach a copy of your curriculum vitae (preferably as a PDF).
  • If you are (or have been) actively enrolled in a graduate program, attach a copy of your graduate transcripts (unofficial copies are acceptable).
  • Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the project of interest, or an individual with whom you have worked on a closely aligned project. ETS will only accept two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2021. If you would like to download the recommendation form to send to your recommenders before submitting your application, you can save your application information for completion later.

Deadline

  • The application deadline is February 1, 2021.

Decisions

  • Applicants will be notified of selection decisions by March 31, 2021.

Duration

  • Ten weeks: June 1, 2021–August 6, 2021

Compensation

  • $7,500 salary

Eligibility

  • Current full-time enrollment in a relevant doctoral program
  • Completion of at least two years of coursework toward the doctorate prior to the program start date

Selection

The main criteria for selection will be scholarship and the match of applicant interests and experience with the research projects.

ETS affirmative action goals will be considered. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

For more information please visit:

www.ets.org/research/internship-fellowship/summer-rms-intership

Educational Testing Service II (ETS)

2021 Summer AI Research Labs Internship Program
Description

The AI Research Labs’ work drives the innovation and development of teaching and learning technologies that are grounded in research and powered by next-generation AI. The Labs are dedicated to working closely with end-users to uncover real-world needs, and co-designing and prototyping solutions to meet those needs. Our staff is made up of research and learning scientists, software developers, research engineers, user experience researchers and designers, instructional designers and producers, and product owners.

Applying for an Internship in the AI Research Labs

Interns accepted into the AI Research Labs summer program will participate in user needs discovery and exploration, solution ideation and validation, capability and prototype development, iterative user validation, and data-driven solution optimization. We work in agile development teams to apply the best of foundational learning and cognitive science to the design, development and testing of solutions to meet educator and learner needs. Upon completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Applicants who have interest and expertise in the following would be a great fit for this program:

  • learning or cognitive science
  • software development
  • AI and ML engineering
  • user experience research and/or design
  • instructional design
  • product ownership

Note: Applicants may apply to the RMS or AI Labs Internship programs, but not both. However, all applicants may be considered for both programs, depending on qualifications and project needs.

Application Procedures

Complete the electronic application form. On the application form:

  • Identify the Lab that you are interested in and provide a written statement about your interest in the area(s) of research in the Lab and how your education and experience align with the work of the Lab.
  • Attach a copy of your curriculum vitae (preferably as a PDF).
  • If you are (or have been) actively enrolled in a graduate program, attach a copy of your graduate transcripts (unofficial copies are acceptable).
  • If you were accepted into a graduate program and deferred enrollment, attach proof of acceptance.
  • Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the project of interest, or an individual with whom you have worked on a closely aligned project. ETS will only accept two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2021. If you would like to download the recommendation form to send to your recommenders before submitting your application, you can save your application information for completion later.

Deadline

  • The application deadline is February 1, 2021.

Decisions

  • Applicants will be notified of selection decisions by March 31, 2021.

Duration

  • Ten weeks: June 1, 2021–August 6, 2021

Compensation

  • $7,500 salary

Eligibility

  • Completion of bachelor's degree
  • Actively enrolled or accepted into a graduate program aligned to a Lab focus (students who have deferred enrollment due to extenuating circumstances will be considered)

Selection

The main criteria for selection will be the match of applicant interests and experience with the focus of the Labs.

ETS affirmative action goals will be considered. The Research Labs value building teams of individuals from diverse backgrounds and with diverse experiences. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

For more information please visit:

https://www.ets.org/research/internship-fellowship/summer-ai-research-labs-internship

Educational Testing Service III (ETS)

National Assessment of Educational Progress (NAEP) Internships

ETS-NAEP Summer Internship Opportunities

ETS Research & Development is committed to developing a talent pipeline of diverse researchers and scientists in data science, data analysis, psychometrics, test fairness, validity, measurement and statistics. Students can apply for learning opportunities at the undergraduate, graduate or post-doctoral level through externships, internships, fellowships or post-doctoral appointments. All appointments within the ETS-NAEP Program are based on a business research project model and designed to expose individuals to our mission and culture, the day-to-day operations and value of working in educational testing and assessment, and how the National Center for Education Statistics (NCES)-NAEP and ETS-NAEP collaborate to support a dynamic national educational imperative.

The Internship Experience

As part of our 2021 Summer internship experience, our interns will work virtually for an eight-week period on a business research project and will be paired with an ETS R&D NAEP staff member who will serve as their mentor. Since our program is designed to expose students to careers in our field, our interns also receive learning and professional development experiences that include, but are not limited to:

  • individual professional mentoring/coaching
  • weekly research seminars led by experts in the fields of research, measurement, psychometrics and statistics
  • opportunities to connect with and learn from leaders across NCES-NAEP
  • professional presentation opportunities and exposure to opportunities for future full-time employment in ETS-NAEP

Interested students can apply for one of two internships.

The Summer Undergraduate Research Experience (SURE) — Undergraduate Students

We provide business research project-based experiences for diverse students, from those entering their third year of study through recent graduates (within the past semester). Students must be enrolled in a four-year, United States-based accredited institution and have at least one year of classroom and/or external research experience. All majors are welcome.

Apply for the SURE internship

The Summer Pre-Doctoral Research Experience (SPRE) — Graduate Students

Individuals currently enrolled in master's or doctoral-level programs will engage in learning experiences designed to apply their educational training with applied experiences working on a business research project. Diverse students majoring in degree programs in psychology, psychometrics, data science, learning science, computer science, cognitive science, artificial intelligence and/or machine learning who have completed at least two years of coursework are encouraged to apply.

Apply for the SPRE internship

ETS-NAEP University Partnership Efforts

NCES-NAEP and ETS Research & Development have shared missions to inform, learn from, support and engage the communities where we work and live in the key areas within NAEP: Psychometrics & Data Analysis, Assessment Design & Development, and Survey Instrument Design. As part of this ongoing commitment, we work with colleges and universities that have programs/majors that align with the business areas within NAEP to provide students and faculty with learning, professional development and employment opportunities that expose them to NAEP’s work.

ETS and Howard University

Established in 2012, the ETS-NAEP Statistics & Evaluation Institute provides students, faculty and staff with no-cost learning opportunities in the areas of research, assessment, and quantitative and qualitative methods. Over the eight-week period at the Institute, participants attend daily seminars that consist of a hybrid of lecture/classroom and applied/lab learning. This model encourages individuals to actively learn and then apply the knowledge gained in an iterative manner, and includes ongoing feedback and coaching from the faculty. Since the program is supported by NCES-NAEP, participants also learn directly from staff who work within NAEP and are provided with opportunities to connect with us for visiting and full-time employment positions.

Human Resources Research Organization

Internship in Educational Measurement

HumRRO

Since 1951, HumRRO has applied social science research to enhance human performance and organizational effectiveness. Our 90-plus professional researchers hold advanced degrees in industrial-organizational, experimental, quantitative, or social psychology; education research and measurement; or other social science disciplines. HumRRO is an independent, nonprofit organization that conducts human resource research and analyses for federal, state, and local agencies. Our clients also include professional associations and private sector companies. To learn more about HumRRO, visit our website at www.humrro.org.

Location

Summer interns will have the opportunity to work under the direction and supervision of senior staff in one of HumRRO’s offices. HumRRO headquarters is located in Alexandria, Virginia—just outside of our nation’s capital. Our office overlooks the Potomac River in historic Old Town. Other HumRRO offices are in Louisville, Kentucky; Monterey, California; and Minneapolis, Minnesota.

Internships

HumRRO’s internships are available to students currently enrolled full time in an accredited master’s or Ph.D. program in educational assessment or a related field. Our summer internships are paid, full-time opportunities and include paid vacation and a housing stipend. Internship duties and responsibilities vary depending upon the technical content and timeline of the particular projects in need of staffing. Typical tasks include literature review, synthesis, and analysis; data collection, entry, and analysis; survey and other instrument development and administration; and documentation of findings.

HumRRO is looking for responsible, motivated team players with strong technical and communication skills to assist in our contract research activities. HumRRO project directors will provide technical supervision and mentoring to help make the most of the intern’s exposure to, and experience with, our applied research setting.

Selection

Selection is competitive and will be based on a holistic review of application materials for evidence of communication and interpersonal skills, research experience and promise, initiative and motivation, academic coursework, and professional interests. We seek applicants who have a background in research methods and some experience conducting research (e.g., working on a research team, presenting as a co-author at a conference). Strong applicants also demonstrate their ability to participate in research activities, such as writing literature reviews, collecting data, and/or analyzing data. Finalists will be interviewed telephonically.

Application Process

Deadline: HumRRO must receive all applications for the summer internship on or before January 15.

Each applicant must provide:

  1. A curriculum vitae/detailed resume

  2. A one-page personal statement of career goals and internship interests

  3. Official transcript(s) of all graduate work

  4. Contact information for two references (letters of recommendation are not needed)

  5. Completed Statistical Packages and Data Analysis Questionnaire (link on website below)

Applications should be emailed to: internship@humrro.org

You can find more information and FAQs by visiting: https://www.humrro.org/corpsite/internships/.

 




National Board of Medical Examiners Internship

Summer 2021 Internships in Assessment Science and Psychometrics

June 7 - July 30, 2021   Philadelphia, PA

This year’s internship will be virtual, due to uncertainty surrounding the COVID-19 pandemic.

Interns will interact with other graduate students and NBME staff and will present completed projects or work-in-progress to NBME staff. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics. 

Requirements 

  • Active enrollment in a doctoral program in measurement, statistics, cognitive science, medical education, or a related field; completion of two or more years of graduate coursework.
  • Experience or coursework in one or more of the following: test development, IRT, CTT, statistics, research design, and cognitive science. Advanced knowledge of topics such as equating, generalizability theory, or Bayesian methodology is helpful. Skill in writing and presenting research. Working knowledge of statistical software (e.g., Winsteps, BILOG; SPSS, SAS, or R).
  • Interns will be assigned to one or more mentors but must be able to work independently.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a social security number for payroll purposes. 

Compensation 

Total compensation for the two months is approximately $8000. 

Research Projects 

Interns will help define a research problem; review related studies; conduct data analyses (real and/or simulated data); and write a summary report suitable for presentation. Projects are summarized below. Applicants should identify, by number, the two projects they would most prefer to work on.

1. Exploring Response Process Validity Evidence for a Medical Licensing Examination Program. Passing scores on the United States Medical Licensing Examination (USMLE) sequence are intended to signal readiness for the unsupervised practice of medicine in the United States. This project will examine response process validity evidence for USMLE score interpretations. USMLE comprises three computer-based multiple-choice examinations: Step 1, Step 2 Clinical Knowledge, and Step 3. For each multiple-choice question on each Step examination, various testing features are available to examinees. These features include the ability to 1) highlight examination text, 2) cross out examination text, 3) take notes in a blank window, and 4) view lab values. This project will examine the associations between use of these features and performance on the examinations. Inferential statistical analyses will be done for all examinees as well as by examinee subgroup, defined by such characteristics as gender and location of medical school. Among other responsibilities, an intern for this project would help to design the study, manage and manipulate a large, complex data set, conduct statistical analyses, and interpret and present/describe results. 
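The inferential analyses this project describes would build on simple associations between feature use and exam performance. As a minimal sketch (the data and variable names below are invented for illustration, not NBME data or methods), a Pearson correlation between per-examinee highlight counts and scores needs nothing beyond the Python standard library:

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Hypothetical examinee records (highlight count, exam score); illustrative only.
highlights = [0, 2, 5, 1, 8, 3]
scores = [210, 225, 240, 215, 250, 230]
print(round(pearson(highlights, scores), 3))  # strong positive association
```

A real study would of course use formal inferential tests and repeat the analysis within examinee subgroups, but the underlying association statistic is this simple.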

2. Virtual Performance Assessment. The United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills Examination (Step 2 CS), in existence since 2004, is undergoing a revitalization. Before the planned rollout of the restructured exam, the NBME will be conducting multiple research studies. Possible projects include work on generalizability, quality assurance metrics, or evaluation of scoring/equating designs for this virtual examination. 

3. What Determines Physician Competence? The USMLE assesses the competence of future physicians through a system of tests that checks the breadth and depth of medical knowledge, as well as clinical skills and doctor-patient communication. In addition to USMLE scores, a plethora of exam scores are collected throughout medical school and residency. The goal of this project is to examine the relationships between the various intermediate medical education tests and ultimate indicators of physician competence. 

4. Interactive Psychometric Dashboards. Contemporary methods of communicating score information are evolving beyond fixed displays on traditional 8 ½ by 11-inch page layouts. Additionally, score users demand more flexibility and specificity to draw tailored inferences in support of evidence-based decisions. For this project the intern will contribute to the development of dashboard products using R Shiny and participate in engagement efforts with internal and external customers. 

5. Using NLP to Explore Item Bias and Test Validity. The use of Natural Language Processing (NLP) within applications, in conjunction with psychometric theory and statistics, has shown great promise for providing innovative information that was not previously available in the item and test development processes of high-stakes licensure programs. This project will focus on the use of basic NLP techniques and application development intended to analyze text in large item banks, to improve item and test development and provide more evidence for test fairness and the validity of test scores. An intern with little or no NLP experience, but who is interested in learning more about this field, may be suitable. Interns with previous NLP experience will have the possibility of incorporating more advanced NLP techniques and analyses into their work, depending on the project scope and other projects underway at NBME. 
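As a rough illustration of the kind of basic NLP technique this project mentions (not NBME's actual tooling), the sketch below flags pairs of textually similar items in a toy item bank using TF-IDF weighting and cosine similarity; the item stems and the 0.25 threshold are invented for illustration:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF vectors (term -> weight) for a list of tokenized documents."""
    n = len(docs)
    df = Counter()                      # number of documents containing each term
    for doc in docs:
        df.update(set(doc))
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = math.sqrt(sum(w * w for w in u.values())) * \
           math.sqrt(sum(w * w for w in v.values()))
    return dot / norm if norm else 0.0

# Invented item stems, not NBME content.
items = [
    "a 45 year old man presents with crushing chest pain and shortness of breath",
    "a 45 year old man presents with crushing chest pain and diaphoresis",
    "a 30 year old woman presents with a new rash after starting an antibiotic",
]
vecs = tfidf_vectors([s.split() for s in items])
flagged = [(i, j) for i in range(len(vecs)) for j in range(i + 1, len(vecs))
           if cosine(vecs[i], vecs[j]) > 0.25]   # threshold chosen arbitrarily
print(flagged)  # the two chest-pain items are flagged as similar
```

Production systems would use richer representations (e.g., embeddings) and calibrated thresholds, but this captures the core idea of surfacing overlapping item content for test-development review.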

6. Measurement Instrument Revision and Development. This project will continue ongoing work on revising a commonly used measurement instrument so that the appropriate inferences can be made about medical student well-being. Duties may include the following: working with subject-matter experts to revise the existing items; conducting think-alouds with medical students; developing a pilot measure of potential items; exploratory and confirmatory factor analysis of initial pilot results to gather structural validity evidence; developing a larger survey to gather concurrent and discriminant validity evidence with the revised measure; and administration and evaluation of the larger survey. 

7. Qualitative Analysis of Focus Group and Interview Data. The RENEW (Re-examining Exams: NBME Effort on Wellness) task force at NBME is focused on understanding the relationship between the pressures of high-stakes licensure examinations and medical student well-being. As part of this work, a series of focus groups and interviews has been completed, but these data have not yet been fully analyzed. The intern assigned to this project will apply qualitative and mixed-methods approaches to learn more about the information gathered during the focus groups and interviews. 

8. Computer-Assisted Scoring of Constructed Response Test Items. Relying on subject matter experts to score constructed-response test items is expensive, time-consuming, and introduces natural scoring variations inherent when using human raters. Recently the NBME has developed a computer-assisted scoring program that utilizes natural language processing (NLP) to mitigate these issues. The two main components of the program are (1) ensuring that the information in the constructed response is correctly identified and represented; and (2) building a scoring model based on these concept representations. Current areas of research surrounding this project include (but are not limited to): refining quality control steps to be taken prior to an item being used in computer-assisted scoring; linking and equating computer-assisted scores with human rater scores; evaluating a scoring method based on using orthogonal arrays; and developing metrics that assess item quality and test reliability when computer-assisted scores and human scores are used to make classification decisions. The final project will be determined based on a combination of intern interest and project importance. 
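A minimal sketch of the first component described above, checking whether key concepts appear in a constructed response, is shown below. The rubric, phrases, and scoring rule are hypothetical simplifications (real systems use far richer NLP than substring matching):

```python
import re

# Hypothetical rubric: each key concept maps to phrases taken to express it
# (a crude stand-in for the concept representations the program builds).
RUBRIC = {
    "diagnosis": ["myocardial infarction", "heart attack"],
    "first_step": ["aspirin", "ecg", "electrocardiogram"],
}

def score_response(text, rubric=RUBRIC):
    """Return (score, matched concepts): one point per concept whose phrases
    appear in the normalized response. Substring matching is a simplification."""
    norm = " ".join(re.sub(r"[^a-z0-9 ]", " ", text.lower()).split())
    matched = [c for c, phrases in rubric.items()
               if any(p in norm for p in phrases)]
    return len(matched), matched

print(score_response("Likely heart attack; obtain an ECG immediately."))
```

The second component, building a scoring model on top of such concept matches, would then calibrate these raw counts against human rater scores, which is where the linking/equating and quality-control research questions listed above arise.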

9. Modern Test Construction Methods. The art and science of constructing an efficient and effective high-stakes exam requires balancing several competing aims. These theoretical and statistical considerations are explicitly enumerated when constructing an exam, resulting in a set of constraints that require complex algorithms to build secure test forms. Recent research has suggested that some of these constraints may be unnecessarily restrictive, resulting in test item banks that aren’t efficiently utilized and increased costs for test development and construction. An intern for this project would review current practices for exam construction and examine which constraints may be relaxed while still administering an exam with appropriate domain coverage and exam security considerations. 

Application 

Candidates may apply by going to https://nbme.applicantpro.com/jobs/. A cover letter outlining experience and listing project interests by number, together with a current resume, is required. The application deadline is January 31, 2021. 

All applicants will be notified of selection decisions by February 26, 2021. 

Overview 

NBME offers a versatile selection of high-quality assessments and educational services for students, professionals, educators, and institutions dedicated to the evolving needs of medical education and health care. To serve these communities, we collaborate with a comprehensive array of professionals including test developers, academic researchers, scoring experts, practicing physicians, medical educators, state medical board members, and public representatives. 

Together with the Federation of State Medical Boards, NBME develops and manages the United States Medical Licensing Examination. In addition, we are committed to meeting the needs of educators and learners globally with assessment products and expert services such as Subject Examinations, Customized Assessment Services, Self-Assessments, the International Foundations of Medicine, and Item-Writing Workshops. 

We also provide medical education funding and mentorship through the Latin America Grants, Stemmler Fund, and Strategic Educators Enhancement Fund, which serve to advance assessment at educators' and health professionals' own institutions. 

NBME employs approximately 30 doctoral level psychometricians and assessment scientists, as well as several MDs specializing in medical education. Staff is recognized internationally for its expertise in statistical analysis, psychometrics, and test development. 

Learn more about NBME at NBME.org. 

Diversity, Equity, and Inclusion: 

At NBME, we continue to innovate and improve how we fulfill the evolving needs of the health care community. This commitment starts and ends with the people at NBME. By recruiting and empowering talented individuals from various disciplines and backgrounds, including professionals with diverse life experiences, abilities, and perspectives, NBME can take a well-informed, robust approach to advancing medical education and assessment for years to come. 

 

The National Commission on Certification of Physician Assistants (NCCPA)

Position Summary: The National Commission on Certification of Physician Assistants (NCCPA) is offering an eight-week internship for students currently working toward a Ph.D. in psychometrics or a related field, with at least two years of graduate coursework completed. During the program, the intern will have the opportunity to gain experience in operational psychometric tasks involved in administering and scoring a certification assessment and to take the lead on a research project, in collaboration with psychometric staff. The research effort will include submission of a proposal to NCME, AERA, or a similar conference, and will culminate in a research paper that can be delivered at that conference.

Candidates will undertake a project that meets their interests and skills and supports NCCPA’s exam-related research agenda. Current topics of interest to NCCPA revolve largely around our recently concluded, two-year, longitudinal assessment pilot. Specific areas of interest include automated item generation (AIG), multi-stage testing (MST), enemy item identification using machine learning methods, differential item functioning (DIF), small-sample equating, and standard setting.

This program is scheduled to begin in June and to conclude by the end of August. While interns will work remotely, during week one of the internship, they will work closely with the Psychometric team to develop a research plan for the remaining weeks. This initial project development week may be at the NCCPA offices in Johns Creek, GA, conditions permitting. In the event of an on-site experience, travel and lodging expenses will be covered by NCCPA. Over the following weeks, interns will convene in virtual weekly meetings with a mentor to discuss progress and to address questions. All finalized deliverables will be provided to NCCPA at the completion of the internship.

Application Procedure: A complete application includes a curriculum vitae, student copy of graduate school transcript (does not need to be an official transcript), two letters of recommendation, and a statement of purpose describing your interest in the internship as well as your general research interests. Application materials may be emailed or mailed to NCCPA and must be received by February 16, 2021. The internship award will be announced by March 2, 2021. The award includes a $6,000 stipend. Conference travel will be offset, and travel/lodging for trips to the NCCPA offices will be reimbursed in accordance with NCCPA’s policies.

Materials and/or questions should be submitted to:
J. B. Weir, Ph.D., Psychometrician National Commission on Certification of Physician Assistants
12000 Findley Road, Suite 100 Johns Creek, GA, 30097
Phone: 678-417-8173
Email: johnw@nccpa.net

ETS Harold Gulliksen Psychometric Research Fellowship 


Description

The fellowship provides funding for promising graduate students in psychometrics, or a related field, who are conducting innovative, applied research. Selected fellows will carry out their proposed research under the mentorship of their academic advisor and in consultation with their ETS mentors over the course of the funding. The fellowship begins with the selected fellows working on their research as part of the Summer RMS Internship Program, where they will be provided with guidance from ETS’s leading researchers. During the subsequent academic year, fellows study at their universities and carry out research under the supervision of an academic mentor and in consultation with their ETS mentor.

Program Goals

The goal of this program is to increase the number of well-trained scientists in educational assessment, psychometrics and statistics.

Important Dates

    • January 22, 2021 — Deadline for receipt of preliminary application materials
    • February 5, 2021 — Applicants are notified of preliminary application decision
    • March 5, 2021 — Deadline for receipt of final application materials
    • March 31, 2021 — Award recipients are notified

Award Duration

Appointments are for one year.


Award Value

Each fellow's university receives the following:

  • $20,000 to pay a stipend to the fellow
  • $8,000 to defray the fellow's tuition, fees and work-study program commitments
  • A small grant to facilitate work on the fellow's research project

Selected fellows must participate in the Summer RMS Internship Program in Research for Graduate Students.

Eligibility

At the time of application, candidates must be enrolled in a doctoral program, have completed all coursework toward the doctorate, and be at the dissertation stage of their program. Dissertation topics in the areas of psychometrics, statistics, educational measurement, or quantitative methods will be given priority. At the time of application, candidates will be asked to provide a statement describing any additional financial assistance, such as assistantship or grant commitments, that they will have during the fellowship period.

Selection

Selection is based on:

  • Strength of the applicant's academic credentials. Applicants need to demonstrate superior academic ability and achievement, as well as exceptional promise in the field of measurement, psychometrics or statistics.
  • Suitability and the technical strength of the proposed research project. The project must relate to research currently under way at ETS. The preferred arrangement is that the proposed project be the applicant's doctoral thesis and that the applicant is not receiving alternative funding for this work. Non-dissertation projects may be considered provided that the applicant is doing significant independent work.

Application Procedures

Submit all application materials via email with PDF attachments.

Preliminary Application

  • Letter of interest describing the research that would be undertaken during the award year and how the research fits with ETS research efforts
  • Statement describing any additional financial assistance, such as assistantship or grant commitment, that you would have during the fellowship period
  • Nomination letter (either as an email or as an email with a PDF attachment) from an academic advisor in support of your interest in the fellowship award
  • Current curriculum vitae

Final Application

If your preliminary application is approved, you will be invited to submit the following materials:

  • Detailed project description (approximately 15 double-spaced pages) of the research the individual will carry out at the host university, including the purpose, goals and methods of the research
  • Graduate academic transcripts (unofficial copies are acceptable)
  • Evidence of scholarship (presentations, manuscripts, etc.)

Contact

For more information, contact us via email.

Please find the announcement here:

https://www.ets.org/research/internship-fellowship/gulliksen