Internships & Fellowships



         
The following information is maintained by the Graduate Student Issues Committee (GSIC).

If you would like to have your internship or fellowship listed here, please contact our GSIC co-chair, Maura O'Riordan: moriordan@umass.edu
   


Graduate Student Internships

Internships are a valuable way to link your academic experience with the professional arena. Below is a list of internships that will allow students to go beyond the classroom and conduct practical research with a mentor from a testing company or research agency.


Graduate Student Fellowships

Fellowships provide structured work experience and professional development that include intensive training and experiential learning. Below is a list of fellowships that provide support to the fellow's growth and opportunities to explore a particular field of measurement.

American Board of Internal Medicine 

American Board of Internal Medicine Internship

Summer 2021 Psychometric Internship Announcement 

The ABIM is pleased to announce the return of its summer internship in psychometrics. 

Internship Opportunity 

The ABIM’s psychometric internship program is an eight-week summer internship running from Monday, June 14th to Friday, August 6th in Philadelphia, PA*. During the program, the intern will take primary ownership of an applied psychometric research project under the guidance of one of the ABIM’s measurement scientists. The intern will also have opportunities to assist psychometric staff on other research projects and to learn about operational processes (e.g., item analysis, IRT calibration, equating) within the context of medical certification. 

* If necessary for health and safety reasons, the ABIM summer internship will be conducted remotely. ABIM will communicate with candidates about such decisions in the spring. 

† If the internship is conducted remotely, interns will not receive the housing allowance. 

Qualifications 

  • Doctoral student in an educational measurement (or related field) program with at least two years of coursework completed by the start of the internship 
  • Preference will be given to applicants who have experience with item response theory
  • Excellent communication skills
  • Interest in certification testing
  • Eligible to be legally employed in the United States 

Stipend 

The ABIM provides a total of $10,000 for the eight-week internship program. This total includes an $8,000 stipend as well as a $2,000 housing allowance.†

Research Projects 

For their primary research project, the intern should expect to perform all stages of the research process, from literature review to discussion and dissemination of results. At the conclusion of the program, the intern will be expected to share their results by giving a brief presentation to an audience of psychometric staff. Further, the intern will be encouraged to submit their summer project(s) for presentation at a professional conference and/or for publication. The intern will work with their mentor to select an appropriate project for their experience level and interests. Examples of previously completed internship projects can be found below.

Application 

Please submit your curriculum vitae and a letter of interest to Michele Johnson, Research Program Manager (researchintern@abim.org) by Monday, February 1st 2021. 

Examples of Previously Completed Internship Projects 

Please note these are examples of previous ABIM Interns’ projects. The summer 2021 internship project will be determined jointly by the 2021 mentor and summer intern. 

  • Anchor Item Replacement in the Presence of Consequential Item Parameter Drift. The purpose of this project was to develop evidence-based recommendations for replacing anchor items when a significant number are flagged for exhibiting item parameter drift. The intern conducted a simulation study that investigated different item replacement strategies and examined the effects of each strategy with respect to outcomes such as pass/fail classification accuracy and the RMSD of theta estimates.

Chang, K., & Rewley, K. Manuscript under review.

  • Evaluating Use of an Online Open-Book Resource in a High-Stakes Credentialing Exam. This project examined item and examinee characteristics associated with the use of an open-book resource throughout a high-stakes medical certification exam. Using exam process data and a generalized estimating equations modeling framework, the intern examined use of the open-book resource and how it might affect examinees’ test-taking experience and performance.

Myers, A., & Bashkov, B. Paper presented at NCME 2020.

  • Investigating the Impact of Parameter Instability on IRT Proficiency Estimation. This project examined how poorly estimated item parameters impact different proficiency estimators. The intern conducted a simulation study to examine how different levels of parameter instability impact Bayesian vs. non-Bayesian estimators as well as pattern vs. summed-score estimators. 

McGrath, K., & Smiley, W. Paper presented at NCME 2019.

  • Using Data Visualization to Explore Test Speededness in Certification Exams. This project examined different ways to determine whether a test is speeded. The intern conducted a thorough literature review of methods used to detect and quantify test speededness. She then used data visualization, a nonparametric approach that makes no distributional assumptions, to examine operational test data for speededness. This proved to be a viable approach to assessing the impact of examination timing.

Sullivan, M., & Bashkov, B. Paper presented at TIME 2017.

  • Automatic Flagging of Items for Key Validation. This project developed an automatic method for determining whether there is a problem with an item’s key. The intern collected data from psychometricians regarding which items required key validation and used logistic regression to mimic that professional judgment and automatically flag problematic items (a brief sketch of this idea appears after this list).

Sahin, F., & Clauser, J. Paper presented at NCME 2016.
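
To give a concrete flavor of these projects, here is a minimal R sketch of the key-validation idea from the last example above. The data, predictors, and flagging threshold are invented for illustration; this is not ABIM's actual model.

  # Hypothetical sketch: flag items whose key may be wrong by modeling
  # psychometricians' referral judgments from routine item statistics.
  set.seed(1)
  items <- data.frame(
    pvalue  = runif(500, 0.20, 0.95),  # proportion answering correctly
    itemtot = runif(500, -0.10, 0.60)  # item-total (point-biserial) correlation
  )
  # Simulated judgments: low discrimination makes referral more likely
  items$referred <- rbinom(500, 1, plogis(2 - 10 * items$itemtot))
  fit <- glm(referred ~ pvalue + itemtot, family = binomial, data = items)
  items$flag <- predict(fit, type = "response") > 0.5  # flag for key review
  table(items$flag)

In practice the referral judgments would come from psychometricians' records rather than simulation, and the flagging threshold would be tuned to balance missed key problems against review workload.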

National Board of Medical Examiners 

National Board of Medical Examiners Internship

Summer 2021 Internships in Assessment Science and Psychometrics

June 7 - July 30, 2021   Philadelphia, PA

This year’s internship will be virtual, due to uncertainty surrounding the Covid-19 pandemic. 

Interns will interact with other graduate students and NBME staff and will present completed projects or work-in-progress to NBME staff. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics. 

Requirements 

  • Active enrollment in a doctoral program in measurement, statistics, cognitive science, medical education, or a related field; completion of two or more years of graduate coursework.
  • Experience or coursework in one or more of the following: test development, IRT, CTT, statistics, research design, and cognitive science. Advanced knowledge of topics such as equating, generalizability theory, or Bayesian methodology is helpful. Skill in writing and presenting research. Working knowledge of statistical software (e.g., Winsteps, BILOG; SPSS, SAS, or R).
  • Interns will be assigned to one or more mentors but must be able to work independently.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a social security number for payroll purposes. 

Compensation 

Total compensation for the two months is approximately $8,000.

Research Projects 

Interns will help define a research problem; review related studies; conduct data analyses (real and/or simulated data); and write a summary report suitable for presentation. Projects are summarized below. Applicants should identify, by number, the two projects they would most prefer to work on.

1. Exploring Response Process Validity Evidence for a Medical Licensing Examination Program. Passing scores on the United States Medical Licensing Examination (USMLE) sequence are intended to signal readiness for the unsupervised practice of medicine in the United States. This project will examine response process validity evidence for USMLE score interpretations. USMLE comprises three computer-based multiple-choice question examinations: Step 1, Step 2 Clinical Knowledge, and Step 3. For each multiple-choice question on each Step examination, various testing features are available to examinees. These features include the ability to 1) highlight examination text, 2) cross out examination text, 3) take notes in a blank window, and 4) view lab values. This project will examine the associations between use of these features and performance on the examinations. Inferential statistical analyses will be done for all examinees as well as by examinee subgroup, defined by such characteristics as gender and location of medical school. Among other responsibilities, an intern for this project would help to design the study, manage and manipulate a large complex data set, conduct statistical analyses, and interpret and present/describe results.

2. Virtual Performance Assessment. The United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills Examination (Step 2 CS), in existence since 2004, is undergoing a revitalization. Before the planned rollout of the restructured exam, the NBME will be conducting multiple research studies. Possible projects include work on generalizability, quality assurance metrics, or evaluation of scoring/equating designs for this virtual examination. 

3. What Determines Physician Competence? The USMLE assesses the competence of future physicians through a system of tests that checks the breadth and depth of medical knowledge as well as clinical and doctor-patient communication skills. In addition to USMLE scores, a plethora of exam scores is collected throughout medical school and residency. The goal of this project is to examine the relationships between the various intermediate medical education tests and the ultimate indicators of physician competence.

4. Interactive Psychometric Dashboards. Contemporary methods of communicating score information are evolving beyond fixed displays on traditional 8 ½ by 11-inch page layouts. Additionally, score users demand more flexibility and specificity to draw tailored inferences in support of evidence-based decisions. For this project the intern will contribute to the development of dashboard products using R Shiny and participate in engagement efforts with internal and external customers (a minimal illustrative sketch appears after this project list).

5. Using NLP to Explore Item Bias and Test Validity. The use of Natural Language Processing (NLP) within applications, in conjunction with psychometric theory and statistics, has shown great promise for providing innovative information that was not previously available in the item and test development processes of high-stakes licensure programs. This project will focus on the use of basic NLP techniques and application development intended to analyze text in large item banks, to improve item and test development and provide more evidence for test fairness and the validity of test scores. An intern with little or no NLP experience, but who is interested in learning more about this field, may be suitable. Interns with previous NLP experience will have the possibility of incorporating more advanced NLP techniques and analyses into their work, depending on the project scope and other projects underway at NBME. 

6. Measurement Instrument Revision and Development. This project will continue ongoing work on revising a commonly used measurement instrument so that appropriate inferences can be made about medical student well-being. Duties may include the following: working with subject-matter experts to revise the existing items; conducting think-alouds with medical students; developing a pilot measure of potential items; exploratory and confirmatory factor analysis of initial pilot results to gather structural validity evidence; developing a larger survey to gather concurrent and discriminant validity evidence with the revised measure; and administration and evaluation of the larger survey.

7. Qualitative Analysis of Focus Group and Interview Data. The RENEW (Re-examining Exams: NBME Effort on Wellness) task force at NBME is focused on understanding the relationship between the pressures of high-stakes licensure examinations and medical student well-being. As part of this work, a series of focus groups and interviews has been completed, but these data have not yet been exhaustively analyzed. The intern assigned to this project will apply qualitative and mixed-methods approaches to learn more about the information gathered during the focus groups and interviews.

8. Computer-Assisted Scoring of Constructed Response Test Items. Relying on subject matter experts to score constructed-response test items is expensive, time-consuming, and introduces natural scoring variations inherent when using human raters. Recently the NBME has developed a computer-assisted scoring program that utilizes natural language processing (NLP) to mitigate these issues. The two main components of the program are (1) ensuring that the information in the constructed response is correctly identified and represented; and (2) building a scoring model based on these concept representations. Current areas of research surrounding this project include (but are not limited to): refining quality control steps to be taken prior to an item being used in computer-assisted scoring; linking and equating computer-assisted scores with human rater scores; evaluating a scoring method based on using orthogonal arrays; and developing metrics that assess item quality and test reliability when computer-assisted scores and human scores are used to make classification decisions. The final project will be determined based on a combination of intern interest and project importance. 

9. Modern Test Construction Methods. The art and science of constructing an efficient and effective high-stakes exam requires balancing several competing aims. These theoretical and statistical considerations are explicitly enumerated when constructing an exam, resulting in a set of constraints that require complex algorithms to build secure test forms. Recent research has suggested that some of these constraints may be unnecessarily restrictive, resulting in test item banks that aren’t efficiently utilized and increased costs for test development and construction. An intern for this project would review current practices for exam construction and examine which constraints may be relaxed while still administering an exam with appropriate domain coverage and exam security considerations. 
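
To give a sense of what project 4 might involve, here is a minimal, self-contained R Shiny sketch of an interactive score display. The data and layout are invented for illustration and do not represent an actual NBME product.

  # Minimal illustrative dashboard: pick an examinee group and view its
  # (simulated) score distribution.
  library(shiny)

  scores <- data.frame(group = rep(c("A", "B"), each = 50),
                       score = c(rnorm(50, 500, 25), rnorm(50, 520, 25)))

  ui <- fluidPage(
    selectInput("grp", "Examinee group:", choices = c("A", "B")),
    plotOutput("hist")
  )

  server <- function(input, output) {
    output$hist <- renderPlot({
      hist(scores$score[scores$group == input$grp],
           main = paste("Simulated score distribution, group", input$grp),
           xlab = "Scaled score")
    })
  }

  shinyApp(ui, server)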

Application 

Candidates may apply by going to https://nbme.applicantpro.com/jobs/. A cover letter outlining experience and listing project interests by number, along with a current resume, are required. Application deadline is January 31, 2021. 

All applicants will be notified of selection decisions by February 26, 2021.

Overview 

NBME offers a versatile selection of high-quality assessments and educational services for students, professionals, educators, and institutions dedicated to the evolving needs of medical education and health care. To serve these communities, we collaborate with a comprehensive array of professionals including test developers, academic researchers, scoring experts, practicing physicians, medical educators, state medical board members, and public representatives.

Together with the Federation of State Medical Boards, NBME develops and manages the United States Medical Licensing Examination. In addition, we are committed to meeting the needs of educators and learners globally with assessment products and expert services such as Subject Examinations, Customized Assessment Services, Self-Assessments, the International Foundations of Medicine, and Item-Writing Workshops. 

We also provide medical education funding and mentorship through the Latin America Grants, Stemmler Fund, and Strategic Educators Enhancement Fund, which serve to advance assessment at educators' and health professionals' own institutions. 

NBME employs approximately 30 doctoral level psychometricians and assessment scientists, as well as several MDs specializing in medical education. Staff is recognized internationally for its expertise in statistical analysis, psychometrics, and test development. 

Learn more about NBME at NBME.org. 

Diversity, Equity, and Inclusion: 

At NBME, we continue to innovate and improve how we fulfill the evolving needs of the health care community. This commitment starts and ends with the people at NBME. By recruiting and empowering talented individuals from various disciplines and backgrounds, including professionals with diverse life experiences, abilities, and perspectives, NBME can take a well-informed, robust approach to advancing medical education and assessment for years to come.

 

Educational Testing Service I (ETS)

2021 Summer Research and Measurement Sciences (RMS)
Internship Program for Graduate Students

Description

If you are a creative and innovative individual who wants to help shape the future of learning and assessment, we encourage you to apply for the 2021 Summer Research and Measurement Sciences (RMS) Internship program. Steeped in decades of broad expertise, RMS conducts rigorous foundational and applied research on the most critical issues facing education and the workforce. Central to ETS’s legacy of global leadership in learning and assessment, RMS is dedicated to advancing the science and practice of measurement, driving innovation in digital assessment, learning and teaching.

Applying for an Internship at RMS

As an intern in RMS, you’ll work with experts who are nationally and internationally known as thought leaders, trusted advisors and go-to collaborators for their high-impact work addressing significant educational and societal goals. ETS staff in RMS have expertise in psychology, education, psychometrics, measurement, statistics, cognitive or learning sciences and data science.

Interns who are accepted into the program will collaborate with scientists on projects related to these areas and will participate in data analysis, writing, and other research tasks. Doctoral students who have completed at least two years of study in one of these fields, or a related field, are encouraged to apply. Upon completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Note: Applicants may apply to the RMS or AI Labs Internship programs, but not both. However, all applicants may be considered for both programs, depending on qualifications and project needs.

Application Procedures

Complete the electronic application form. On the application form:

  • Choose up to two research areas in which you are interested and provide written statements about your interest in the area(s) of research and how your experience aligns with the project.
  • Attach a copy of your curriculum vitae (preferably as a PDF).
  • If you are (or have been) actively enrolled in a graduate program, attach a copy of your graduate transcripts (unofficial copies are acceptable).
  • Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the project of interest, or an individual with whom you have worked on a closely aligned project. ETS will only accept two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2021. If you would like to download the recommendation form to send to your recommenders before submitting your application, you can save your application information for completion later.

Deadline

  • The application deadline is February 1, 2021.

Decisions

  • Applicants will be notified of selection decisions by March 31, 2021.

Duration

  • Ten weeks: June 1, 2021–August 6, 2021

Compensation

  • $7,500 salary

Eligibility

  • Current full-time enrollment in a relevant doctoral program
  • Completion of at least two years of coursework toward the doctorate prior to the program start date

Selection

The main criteria for selection will be scholarship and the match of applicant interests and experience with the research projects.

ETS affirmative action goals will be considered. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

For more information please visit:

www.ets.org/research/internship-fellowship/summer-rms-intership

Educational Testing Service II (ETS)

2021 Summer AI Research Labs Internship Program

Description

The AI Research Labs’ work drives the innovation and development of teaching and learning technologies that are grounded in research and powered by next-generation AI. The Labs are dedicated to working closely with end-users to uncover real-world needs, and co-designing and prototyping solutions to meet those needs. Our staff is made up of research and learning scientists, software developers, research engineers, user experience researchers and designers, instructional designers and producers, and product owners.

Applying for an Internship in the AI Research Labs

Interns accepted into the AI Research Labs summer program will participate in user needs discovery and exploration, solution ideation and validation, capability and prototype development, iterative user validation, and data-driven solution optimization. We work in agile development teams to apply the best of foundational learning and cognitive science to the design, development, and testing of solutions that meet educator and learner needs. Upon completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Applicants who have interest and expertise in the following would be a great fit for this program:

  • learning or cognitive science
  • software development
  • AI and ML engineering
  • user experience research and/or design
  • instructional design
  • product ownership

Note: Applicants may apply to the RMS or AI Labs Internship programs, but not both. However, all applicants may be considered for both programs, depending on qualifications and project needs.

Application Procedures

Complete the electronic application form. On the application form:

  • Identify the Lab that you are interested in and provide a written statement about your interest in the area(s) of research in the Lab and how your education and experience align with the work of the Lab.
  • Attach a copy of your curriculum vitae (preferably as a PDF).
  • If you are (or have been) actively enrolled in a graduate program, attach a copy of your graduate transcripts (unofficial copies are acceptable).
  • If you were accepted into a graduate program and deferred enrollment, attach proof of acceptance.
  • Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the project of interest, or an individual with whom you have worked on a closely aligned project. ETS will only accept two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2021. If you would like to download the recommendation form to send to your recommenders before submitting your application, you can save your application information for completion later.

Deadline

  • The application deadline is February 1, 2021.

Decisions

  • Applicants will be notified of selection decisions by March 31, 2021.

Duration

  • Ten weeks: June 1, 2021–August 6, 2021

Compensation

  • $7,500 salary

Eligibility

  • Completion of bachelor's degree
  • Actively enrolled or accepted into a graduate program aligned to a Lab focus (students who have deferred enrollment due to extenuating circumstances will be considered)

Selection

The main criteria for selection will be the match of applicant interests and experience with the focus of the Labs.

ETS affirmative action goals will be considered. The Research Labs value building teams of individuals from diverse backgrounds and with diverse experiences. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

For more information please visit:

https://www.ets.org/research/internship-fellowship/summer-ai-research-labs-internship

Educational Testing Service III (ETS)

National Assessment of Educational Progress (NAEP) Internships

ETS-NAEP Summer Internship Opportunities

ETS Research & Development is committed to developing a talent pipeline of diverse researchers and scientists in data science, data analysis, psychometrics, test fairness, validity, measurement and statistics. Students can apply for learning opportunities at the undergraduate, graduate or post-doctoral level through externships, internships, fellowships or post-doctoral appointments. All appointments within the ETS-NAEP Program are based on a business research project model and designed to expose individuals to our mission and culture, the day-to-day operations and value of working in educational testing and assessment, and how the National Center for Education Statistics (NCES)-NAEP and ETS-NAEP collaborate to support a dynamic national educational imperative.

The Internship Experience

As part of our 2021 Summer internship experience, our interns will work virtually for an eight-week period on a business research project and will be paired with an ETS R&D NAEP staff member who will serve as their mentor. Since our program is designed to expose students to careers in our field, our interns also receive learning and professional development experiences that include, but are not limited to:

  • individual professional mentoring/coaching
  • weekly research seminars led by experts in the fields of research, measurement, psychometrics and statistics
  • opportunities to connect with and learn from leaders across NCES-NAEP
  • professional presentation opportunities and exposure to possibilities for future full-time employment in ETS-NAEP

Interested students can apply for one of two internships.

The Summer Undergraduate Research Experience (SURE) — Undergraduate Students

We provide business research project-based experiences for diverse students, from those entering their third year of study through recent (within the past semester) graduates. Students must be enrolled in a four-year, United States-based, accredited institution and have at least one year of classroom and/or external research experience. All majors are welcome.

Apply for the SURE internship

The Summer Pre-Doctoral Research Experience (SPRE) — Graduate Students

Individuals currently enrolled in master's or doctoral-level programs will engage in learning experiences designed to connect their academic training with applied experience on a business research project. Diverse students majoring in psychology, psychometrics, data science, learning science, computer science, cognitive science, artificial intelligence and/or machine learning who have completed at least two years of coursework are encouraged to apply.

Apply for the SPRE internship

ETS-NAEP University Partnership Efforts

NCES-NAEP and ETS Research & Development share a mission to inform, learn from, support, and engage the communities where we work and live across the key areas within NAEP: Psychometrics & Data Analysis, Assessment Design & Development, and Survey Instrument Design. As part of this ongoing commitment, we work with colleges and universities that have programs/majors that align with the business areas within NAEP to provide students and faculty with learning, professional development and employment opportunities that expose them to NAEP’s work.

ETS and Howard University

Established in 2012, the ETS-NAEP Statistics & Evaluation Institute provides students, faculty and staff with no-cost learning opportunities in research, assessment, and quantitative and qualitative methods. Over the eight-week period at the Institute, participants attend daily seminars that consist of a hybrid of lecture/classroom and applied/lab learning. This model encourages individuals to actively learn and then apply the knowledge gained in an iterative manner, and includes ongoing feedback and coaching from the faculty. Since the program is supported by NCES-NAEP, participants also learn directly from staff who work within NAEP and they are provided with opportunities to connect with us for visiting and full-time employment positions.

Human Resources Research Organization

Internship in Educational Measurement

HumRRO

Since 1951, HumRRO has applied social science research to enhance human performance and organizational effectiveness. Our 90-plus professional researchers hold advanced degrees in industrial-organizational, experimental, quantitative, or social psychology; education research and measurement; or other social science disciplines. HumRRO is an independent, nonprofit organization that conducts human resource research and analyses for Federal, state, and local agencies. Our clients also include professional associations and private sector companies. To learn more about HumRRO, visit our website at www.humrro.org.

Location

Summer interns will have the opportunity to work under the direction and supervision of senior staff in one of HumRRO’s offices. HumRRO headquarters is located in Alexandria, Virginia—just outside of our nation’s capital. Our office overlooks the Potomac River in historic Old Town. Other HumRRO offices are in Louisville, Kentucky; Monterey, California; and Minneapolis, Minnesota.

Internships

HumRRO’s internships are available to students currently enrolled full time in an accredited master’s or Ph.D. program in educational assessment or a related field. Our summer internships are paid, full-time opportunities and include paid vacation and a housing stipend. Internship duties and responsibilities vary depending upon the technical content and timeline of the particular projects in need of staffing. Typical tasks include literature review, synthesis, and analysis; data collection, entry, and analysis; survey and other instrument development and administration; and documentation of findings.

HumRRO is looking for responsible, motivated team players with strong technical and communication skills to assist in our contract research activities. HumRRO project directors will provide technical supervision and mentoring to help make the most of the intern’s exposure to, and experience with, our applied research setting.

Selection

Selection is competitive and will be based on a holistic review of application materials for evidence of communication and interpersonal skills, research experience and promise, initiative and motivation, academic coursework, and professional interests. We seek applicants who have a background in research methods and some experience conducting research (e.g., working on a research team, presenting as a co-author at a conference). Strong applicants also demonstrate their ability to participate in research activities, such as writing literature reviews, collecting data, and/or analyzing data. Finalists will be interviewed by telephone.

Application Process

Deadline: HumRRO must receive all applications for the summer internship on or before January 15.

Each applicant must provide:

  1. A curriculum vitae/detailed resume

  2. A one-page personal statement of career goals and internship interests

  3. Official transcript(s) of all graduate work

  4. Contact information for two references (letters of recommendation are not needed)

  5. Completed Statistical Packages and Data Analysis Questionnaire (link on website below)

Applications should be emailed to: internship@humrro.org

You can find more information and FAQs by visiting: https://www.humrro.org/corpsite/internships/.

 




National Board of Medical Examiners Internship

Summer 2021 Internships in Assessment Science and Psychometrics

June 7 - July 30, 2021   Philadelphia, PA

This year’s internship will be virtual, due to uncertainty surrounding the Covid-19 pandemic. 

Interns will interact with other graduate students and NBME staff and will present completed projects or work-in-progress to NBME staff. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics. 

Requirements 

  • Active enrollment in doctoral program in measurement, statistics, cognitive science, medical education, or related field; completion of two or more years of graduate coursework.
  • Experience or coursework in one or more of the following: test development, IRT, CTT, statistics, research design, and cognitive science. Advanced knowledge of topics such as equating, generalizability theory, or Bayesian methodology is helpful. Skill in writing and presenting research. Working knowledge of statistical software (e.g., Winsteps, BILOG; SPSS, SAS, or R).
  • Interns will be assigned to one or more mentors but must be able to work independently.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a social security number for payroll purposes. 

Compensation 

Total compensation for the two months is approximately $8000. 

Research Projects 

Interns will help define a research problem; review related studies; conduct data analyses (real and/or simulated data); and write a summary report suitable for presentation. Projects are summarized below. Applicants should identify 2 projects by number that they prefer to work on. 

1. Exploring Response Process Validity Evidence for a Medical Licensing Examination Program. Passing scores on the United States Medical Examination (USMLE) sequence are intended to signal readiness for the unsupervised practice of medicine in the United States. This project will examine response process validity evidence for USMLE score interpretations. USMLE is comprised of three computer-based multiple-choice question examinations: Step 1, Step 2 Clinical Knowledge, and Step 3. For each multiple-choice question on each Step examination, various testing features are available to examinees. These features include the ability to 1) highlight examination text, 2) cross out examination text 3) take notes in a blank window, and 4) view lab values. This project will examine the associations among performing these activities and performance on the examinations. Inferential statistical analyses will be done for all examinees as well as by examinee subgroup, defined by such characteristics as gender and location of medical school. Among other responsibilities, an intern for this project would help to design the study, manage and manipulate a large complex data set, conduct statistical analyses, and interpret and present/describe results. 

2. Virtual Performance Assessment. The United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills Examination (Step 2 CS), in existence since 2004, is undergoing a revitalization. Before the planned rollout of the restructured exam, the NBME will be conducting multiple research studies. Possible projects include work on generalizability, quality assurance metrics, or evaluation of scoring/equating designs for this virtual examination. 

3. What Determines Physician Competence? The USMLE assesses the competence level of future physicians based on the system of tests that check the breadth and depth of medical knowledge, as well as clinical and doctor-patient communication skills. In addition to the USMLE scores, a plethora of exam scores are collected throughout medical school and residency. The goal of this project is to examine the relationships between the various intermediate educational medical tests and the ultimate physician competence indicators. 

4. Interactive Psychometric Dashboards. Contemporary methods of communicating score information are evolving beyond fixed displays on traditional 8 ½ by 11-inch page layouts. Additionally, score users demand more flexibility and specificity to draw tailored inferences in support of evidence-based decisions. For this project the intern will contribute to the development of dashboard products using R Shiny and participate in engagement efforts with internal and external customers. 

5. Using NLP to Explore Item Bias and Test Validity. The use of Natural Language Processing (NLP) within applications, in conjunction with psychometric theory and statistics, has shown great promise for providing innovative information that was not previously available in the item and test development processes of high-stakes licensure programs. This project will focus on the use of basic NLP techniques and application development intended to analyze text in large item banks, to improve item and test development and provide more evidence for test fairness and the validity of test scores. An intern with little or no NLP experience, but who is interested in learning more about this field, may be suitable. Interns with previous NLP experience will have the possibility of incorporating more advanced NLP techniques and analyses into their work, depending on the project scope and other projects underway at NBME. 

6. Measurement Instrument Revision and Development. This project will continue ongoing work on revising a commonly used measurement instrument so that the appropriate inferences can be made about medical student well-being. Duties may include the following: working with subject-matter experts to revise the existing items; conducting think-alouds with medical students; developing a pilot measure of potential items; exploratory and confirmatory factor analysis of initial pilot results to gather structural validity evidence; developing a larger survey to gather concurrent and discriminate validity evidence with the revised measure; and administration and evaluation of the larger survey. 

7. Qualitative Analysis of Focus Group and Interview Data. The RENEW (Re-examining Exams: NBME Effort on Wellness) task force at NBME is focused on understanding the relationship between the pressures of high-stakes licensure examinations and medical student well-being. As part of this work, a series of focus groups and interviews have been completed, but this data has not been exhaustively utilized and analyzed. The intern assigned to this project will apply qualitative and mixed-method methodologies to learn more about the information gathered during the focus groups and interviews. 

8. Computer-Assisted Scoring of Constructed Response Test Items. Relying on subject matter experts to score constructed-response test items is expensive, time-consuming, and introduces natural scoring variations inherent when using human raters. Recently the NBME has developed a computer-assisted scoring program that utilizes natural language processing (NLP) to mitigate these issues. The two main components of the program are (1) ensuring that the information in the constructed response is correctly identified and represented; and (2) building a scoring model based on these concept representations. Current areas of research surrounding this project include (but are not limited to): refining quality control steps to be taken prior to an item being used in computer-assisted scoring; linking and equating computer-assisted scores with human rater scores; evaluating a scoring method based on using orthogonal arrays; and developing metrics that assess item quality and test reliability when computer-assisted scores and human scores are used to make classification decisions. The final project will be determined based on a combination of intern interest and project importance. 

9. Modern Test Construction Methods. The art and science of constructing an efficient and effective high-stakes exam requires balancing several competing aims. These theoretical and statistical considerations are explicitly enumerated when constructing an exam, resulting in a set of constraints that require complex algorithms to build secure test forms. Recent research has suggested that some of these constraints may be unnecessarily restrictive, resulting in test item banks that aren’t efficiently utilized and increased costs for test development and construction. An intern for this project would review current practices for exam construction and examine which constraints may be relaxed while still administering an exam with appropriate domain coverage and exam security considerations. 

Application 

Candidates may apply by going to https://nbme.applicantpro.com/jobs/. A cover letter outlining experience and listing project interests by number, along with a current resume, are required. Application deadline is January 31, 2021. 

All applicants will be notified of selection decisions by February 26, 2020. 

Overview 

NBME offers a versatile selection of high-quality assessments and educational services for students, professionals, educators, and institutions dedicated to the evolving needs of medical education and health care. To serve these communities, we collaborate with a comprehensive array of professionals including test developers, academic researchers, scoring experts, 

practicing physicians, medical educators, state medical board members, and public representatives. 

Together with the Federation of State Medical Boards, NBME develops and manages the United States Medical Licensing Examination. In addition, we are committed to meeting the needs of educators and learners globally with assessment products and expert services such as Subject Examinations, Customized Assessment Services, Self-Assessments, the International Foundations of Medicine , and Item-Writing Workshops. 

We also provide medical education funding and mentorship through the Latin America Grants, Stemmler Fund, and Strategic Educators Enhancement Fund, which serve to advance assessment at educators' and health professionals' own institutions. 

NBME employs approximately 30 doctoral level psychometricians and assessment scientists, as well as several MDs specializing in medical education. Staff is recognized internationally for its expertise in statistical analysis, psychometrics, and test development. 

Learn more about NBME at NBME.org. 

Diversity, Equity, and Inclusion: 

At NBME, we continue to innovate and improve how we fulfill the evolving needs of the health care community. This commitment starts and ends with the people at NBME. By recruiting and empowering talented individuals from various disciplines and backgrounds, which includes professionals with diverse life experiences, abilities, and perspectives, NBME can take a well-informed, robust approach to advancing medical education and assessment for years to come. Learn more about NBME at NBME.org. 

 

The National Commission on Certification of Physician Assistants (NCCPA)

Position Summary: The National Commission on Certification of Physician Assistants (NCCPA) is offering an eight-week internship for students currently working toward their Ph.D. in psychometrics (or other relevant fields), with at least two years of graduate coursework. During the program, the intern will have the opportunity to gain experience in operational psychometric tasks involved in administering and scoring a certification assessment and to take the lead on a research project, in collaboration with psychometric staff. The research effort will include submission of a proposal to NCME, AERA, or similar conference, and will culminate in a research paper that can be delivered at that conference.

Candidates will undertake a project that meets their interests and skills and supports NCCPA’s exam-related research agenda. Current topics of interest to NCCPA revolve largely around our recently concluded, two-year, longitudinal assessment pilot. Specific areas of interest include automated item generation (AIG), multi-stage testing (MST), enemy item identification using machine learning methods, differential item functioning (DIF), small-sample equating, and standard setting.
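
As one concrete illustration of the listed topics, a basic differential item functioning (DIF) screen can be run with the Mantel-Haenszel procedure in base R. The data below are simulated, and stratifying on a simulated ability stands in for the usual matching on observed total scores; this is a sketch, not NCCPA's operational procedure.

  # Hedged sketch: Mantel-Haenszel DIF check for one simulated item
  set.seed(3)
  n      <- 2000
  group  <- rbinom(n, 1, 0.5)   # 0 = reference group, 1 = focal group
  theta  <- rnorm(n)            # ability
  strata <- cut(theta, quantile(theta, 0:5 / 5), include.lowest = TRUE)
  # The item is 0.2 logits harder for the focal group at equal ability (DIF)
  correct <- rbinom(n, 1, plogis(theta - 0.2 * group))
  mantelhaen.test(table(correct, group, strata))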

This program is scheduled to begin in June and to conclude by the end of August. Interns will work remotely; during week one, however, they will work closely with the Psychometric team to develop a research plan for the remaining weeks. This initial project development week may take place at the NCCPA offices in Johns Creek, GA, conditions permitting. In the event of an on-site experience, travel and lodging expenses will be covered by NCCPA. Over the following weeks, interns will convene in weekly virtual meetings with a mentor to discuss progress and address questions. All finalized deliverables will be provided to NCCPA at the completion of the internship.

Application Procedure: A complete application includes a curriculum vitae, a student copy of the graduate school transcript (it does not need to be an official transcript), two letters of recommendation, and a statement of purpose describing your interest in the internship as well as your general research interests. Application materials may be emailed or mailed to NCCPA and must be received by February 16, 2021. The internship award will be announced by March 2, 2021. The award includes a $6,000 stipend. Conference travel costs will be offset, and travel/lodging for trips to the NCCPA offices will be reimbursed in accordance with NCCPA’s policies.

Materials and/or questions should be submitted to:
J. B. Weir, Ph.D., Psychometrician National Commission on Certification of Physician Assistants
12000 Findley Road, Suite 100 Johns Creek, GA, 30097
Phone: 678-417-8173
Email: johnw@nccpa.net

National Center for the Improvement of Educational Assessment

2021 Summer Internship Program in Educational Assessment and Accountability

The National Center for the Improvement of Educational Assessment, Inc. (the Center) is a small nonprofit organization that occupies a unique and influential niche at the intersection of educational measurement and educational assessment policy. The Center is pleased to offer up to four (4) summer internships for advanced doctoral students in educational measurement and/or assessment/accountability policy who want the opportunity to work with the Center’s professionals on projects with direct implications for state and national educational policy.
Note: Given the current Covid-19 context, we plan to offer these internships remotely unless conditions change dramatically between now and June.

The Summer Internship Program

Each intern will work on one major project throughout the summer (to be negotiated between the intern and the Center mentor) and may participate with Center staff on other ongoing projects. The intern will have the opportunity to attend meetings and interact with state assessment personnel. Interns will be expected to produce a written report and a proposal for a research conference (e.g., NCME, AERA) as evidence of successful completion of their project. One of the Center’s senior staff will serve as the intern’s primary mentor, but the interns will interact regularly with many of the Center’s staff. Potential intern projects for 2021 may include the following:
1. Improving the interpretability of test score reports. The aesthetics and quality of information presented on test score reports have improved over the last decade, but a survey of state individual student reports conducted by a 2019 Center summer intern (Tanaka, 2019) revealed that the error associated with test scores was rarely reported. The Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014) explicitly call for the error or uncertainty of test scores to be included any time scores are reported. When asked why error is not being reported, many test contractors and state assessment leaders reported that users did not understand how to interpret error and were frustrated trying to make sense of these reports. This internship combines assessment literacy and report design to better understand how we might produce more accurate and useful score reports. The project will involve reviewing assessment literacy research on how best to communicate measurement error, designing report mock-ups, and conducting cognitive laboratories with potential stakeholders to evaluate and refine draft designs (a small illustrative sketch appears after this project list).
2. Creating a framework for Opportunity-to-Learn (OTL). Opportunity-to-learn is a more than 50-year-old concept that has evolved from a focus on whether students have had sufficient access to instruction or content linked to particular concepts to a more robust conception of the conditions and resources provided to schools to enable students to succeed. Marion argued for collecting OTL data in 2020-2021 to help contextualize the interpretation of 2020-2021 test scores, and because summative assessments, even in the best of conditions, do not provide enough of the information policymakers need to understand students’ learning context. The intern selected for this project would build on the general framework outlined by Marion to create a detailed set of guidelines for states and school districts describing the types of indicators that should be collected and at what level of the system (e.g., student, district), the types of analyses that should be conducted, including using the OTL data to aid in interpreting student test scores, and how these OTL data should be reported. The intern and mentor will partner with at least one state to analyze actual OTL data and use these analyses to help refine the guidance.
3. Investigating the impact of performance assessments on instruction. Performance assessments are considered an authentic measure of student achievement. They are also considered by some educational reformers to be an intervention that promotes changes to instructional practices in schools, specifically by increasing the level and complexity of the content that students are asked to learn. We know less about how teachers respond to large-scale performance assessment programs than we do about how teachers respond to high-stakes standardized tests. The goal of this project is to better understand how teachers perceive that implementing summative classroom performance assessments influences how and what they teach and what students are asked to learn. Project activities will likely include: (1) a review of the literature to contextualize what is known about whether and how teachers change their instructional practices due to the use of summative classroom performance assessments; (2) examination of instruments and techniques used to measure teachers’ instructional changes; and (3) recommendations for approaches states and districts could take to measure the impact of performance assessments on instruction.
4. Modeling student performance on a game-based assessment. This internship focuses on the examination of a game-based assessment in early literacy and mathematics that is administered over the course of a school year. This assessment is made up of multiple games aligned to key skills, which are administered over time based on educator needs. This design poses challenges to the construction of an interpretive and supporting validity argument, as well as to understanding and modeling student performance over time. Each game is designed to measure a separate early literacy or mathematics skill, and the intended interpretations about students are at the skill level. Therefore, understanding student performance over time is a matter of modeling within-game performance across time for a number of games. The internship project, then, is to first describe performance across time and then model it. Currently, longitudinal item response theory models appear to hold promise for these data, but the interested intern may suggest alternatives based on the initial description of the data. The ultimate aim of the work is to provide a comprehensive description of student skill performance to inform the ongoing development and use of the games.
5. Emphasizing design in the development of assessment and accountability systems. As states consider a redesign of their current assessment and accountability system(s), they should look to the design approaches used in other industries (technology, pharmaceuticals, manufacturing) to understand and apply processes for innovation. To date, we have yet to find any studies that synthesize across these various approaches to determine their common and distinct elements of innovation and whether or how they could be applied to the field of education. We contend that a cross-sector analysis of design processes could lead to a more holistic understanding of how innovation happens in developing an assessment and accountability system and could benefit the education research and practice community. To address this gap in the research literature, the goal of this internship is to conduct a cross-sector examination of design processes to address three overarching questions: (1) how do different industries design to innovate, (2) what design approaches do they use to implement new products and innovate existing systems, and (3) what can be learned from these cross-sector design approaches to inform assessment and accountability system innovation in K-12 education?
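
For a sense of what the report-design work in project 1 might prototype, here is a small base R sketch that displays a scale score with an uncertainty band rather than a bare point estimate. The score and SEM are invented, and real mock-ups would be refined through the cognitive laboratories described above.

  # Illustrative only: a scale score of 512 with SEM = 12, displayed
  # with a 95% uncertainty band.
  score <- 512; sem <- 12
  plot(score, 1, xlim = c(440, 580), ylim = c(0.5, 1.5), yaxt = "n",
       pch = 19, xlab = "Scale score", ylab = "")
  arrows(score - 1.96 * sem, 1, score + 1.96 * sem, 1,
         angle = 90, code = 3, length = 0.05)
  text(score, 1.25, "Student score with 95% uncertainty band")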

General Qualifications

The intern must have completed at least two years of doctoral course work in educational measurement, curriculum studies, statistics, research methods, or a related field. Interns with documented previous research experience are preferred. Further, interns must document their ability to work independently to complete a long-term project. We have found that successful interns possess most of the following skills and knowledge (the importance of each of the areas described below depends on the specific project):
- Ability to work on a team under a rapid development model
- A deep understanding of educational assessment and its uses, including policy and practice
- Content knowledge in a relevant discipline (e.g., science, mathematics, language arts)
- Depending on the project, working knowledge of statistical analysis through multivariate analyses as well as fluency with one or more statistical packages, e.g., SAS, SPSS, R
- A solid understanding of research design
- Psychometrics (both classical test theory and IRT), with a demonstrated understanding of the principles of reliability and validity
- An interest in applying technical skills and understanding major policy and practical issues
- Excellent written and competent spoken English skills

Logistics

The internship duration is 8 weeks, and the internship will be conducted remotely this year unless current health conditions change considerably prior to June. The internship will start in early June 2021; the specific date will be determined by the intern and the mentor.

Support

The Center will provide a stipend of $6,000 as well as a housing allowance and reasonable relocation expenses (should the internship be in person).

Application

To apply for the internship program, candidates should submit the following materials electronically:
- A letter of interest explaining why the candidate would be a good fit with the Center, what the candidate hopes to gain from the experience, and which project(s) the candidate prefers. Further, the letter should explain both what the candidate could contribute to the preferred project(s) and why the project(s) fit with the candidate’s interests.
- A curriculum vitae, and
- Two letters of recommendation (one must be from the candidate’s academic advisor).

Materials must be submitted electronically (including letters of recommendation) to:
Sandi Chaplin at schaplin@nciea.org and received by February 14, 2021.

National Association of Boards of Pharmacy 

Psychometrician Internship

The summer psychometrician internship is an eight-week, remote program. During the program, the intern will have an opportunity to gain experience in operational psychometric tasks involved in administering and scoring exams and take primary ownership of an applied research project related to psychometrics under the guidance of a psychometrician mentor. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics.

Job Description

Interns will help define a research question, review related studies, conduct data analyses (real and/or simulated data), and write a summary report suitable for presentation. Current research projects are summarized below. Applicants are encouraged to identify specific projects they prefer to work on and, if desired, to indicate any other research they would like to conduct.

  • Building an anomaly detection system for identifying aberrant testing behavior (a minimal sketch of one such screen follows this list)
  • Investigating novel optimization techniques for automated test assembly
  • Creating a system to identify enemy items using machine learning
  • Developing formative assessments using Bayesian networks or cognitive diagnostic models
  • Exploring novel standard-setting methods
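To give a flavor of the first project above, here is a minimal, hypothetical R sketch of a person-fit screen, one common building block of anomaly detection in testing. The simulated Rasch data and the standardized lz index shown here are illustrative assumptions only and do not describe NABP’s operational system.

    # Minimal, hypothetical person-fit screen (illustrative only).
    # Simulate a scored 0/1 response matrix under a Rasch model; in practice,
    # the probabilities would come from estimated person and item parameters.
    set.seed(1)
    n_persons <- 200; n_items <- 40
    b     <- rnorm(n_items)                   # item difficulties
    theta <- rnorm(n_persons)                 # person abilities
    p     <- plogis(outer(theta, b, "-"))     # Rasch response probabilities
    resp  <- matrix(rbinom(n_persons * n_items, 1, p), n_persons, n_items)

    # lz statistic: standardized log-likelihood of each response pattern
    lz <- sapply(seq_len(n_persons), function(i) {
      pr <- p[i, ]
      ll <- sum(resp[i, ] * log(pr) + (1 - resp[i, ]) * log(1 - pr))  # observed
      ev <- sum(pr * log(pr) + (1 - pr) * log(1 - pr))                # expected
      vv <- sum(pr * (1 - pr) * log(pr / (1 - pr))^2)                 # variance
      (ll - ev) / sqrt(vv)
    })
    flagged <- which(lz < -1.96)   # large negative lz suggests aberrant responding

An operational system would combine several such indices with response-time and similarity statistics, but the underlying pattern of comparing observed behavior to model expectations is the same.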

Job Requirements

  • Must be a doctoral student in an educational measurement or related program who has completed at least two years of coursework.
  • Experience with item response theory preferred.
  • Excellent communication skills.
  • Demonstrated interest in licensure and certification testing.
  • Proficiency in R and Winsteps preferred.
  • Must be authorized to work in the United States for any employer.

To apply for this position, please email the following information to hr@nabp.pharmacy.

  • Your curriculum vitae with a list of completed graduate courses
  • A statement of purpose describing your interests, including preferred projects and general research interests

Application materials must be received by February 19, 2021, and the internship will be announced in mid-March 2021.

No phone calls, please.

Digital Harbor Foundation 

Virtual Fellowship in Computational Psychometrics

The Duolingo English Test and Digital Harbor Foundation are launching a new fellowship program to help train the next generation of assessment researchers. The program will support three (3) fellows during the summer of 2021 (June 16 - August 4). Applications are open to doctoral candidates authorized to work in the US, and intentional efforts will be made to attract diverse candidates from minority-serving institutions. Each fellow will receive a $7,000 stipend from the Digital Harbor Foundation.

REQUIREMENTS
● Completion of two or more years of coursework
● Experience in one or more of the following: test development, IRT, CTT, statistics, research design, adaptive testing, or programming in R or Python
● Working knowledge of statistical software (e.g., R or Python)
● Preferred candidates will also have advanced knowledge of topics such as CAT, natural language processing, diagnostic modeling, generalizability theory, or Bayesian theory

ELIGIBILITY
● Active enrollment in a doctoral program in measurement, psychometrics, statistics, cognitive science, machine learning, or a related field
● Authorization to work in the US for any employer. If selected, F-1 visa holders will need to apply for Curricular Practical Training authorization through their school’s international student office and have a Social Security number for payroll purposes

MATERIALS TO SUBMIT
● A current CV
● A cover letter
● The title of the dissertation and an abstract (no more than 200 words, single-spaced, including references)
● An official letter of recommendation from the applicant’s research supervisor

TIMELINE
Applications should be submitted by February 15, 2021, at 11:59 pm EDT. Candidates will be reviewed jointly by the Duolingo English Test and Digital Harbor Foundation teams. Award notifications will be sent out in March.
Fellowship dates: June 16, 2021 - August 4, 2021

HOW TO APPLY
● Applicants: Email your application as an attachment to englishtest-research@duolingo.com with the subject line “[your last name] Computational Psychometrics Fellowship”
● Supervisors: Email your letter of recommendation as an attachment to englishtest-research@duolingo.com with the subject line “[applicant’s last name] Computational Psychometrics Fellowship.” The letter of recommendation must come directly from the supervisor

Previous winners of the Duolingo doctoral awards will not be considered.

FOR ANY INQUIRIES
For any inquiries regarding the application process or the grant, please contact us at englishtest-research+fellowship@duolingo.com. Responses to general questions will be posted in the FAQs on the Duolingo English Test FAQ page.

Curriculum Associates 

The Curriculum Associates Psychometric and Research Summer 2021 Internship

Curriculum Associates is seeking doctoral students for a summer internship focused on psychometric and research projects related to the award-winning i-Ready® Diagnostic, as well as the nationally recognized i-Ready Personalized Instruction system.

About Curriculum Associates
Curriculum Associates is a rapidly growing educational technology and publishing company committed to making classrooms better places for teachers and students. Our assessments provide valuable feedback to teachers and students and are primarily used to place students into individualized instructional paths.

About the Internship
The Curriculum Associates Psychometric and Research Summer Interns will work on both operational psychometric and research projects and will have the opportunity to gain exposure to many aspects of the assessment design and development process, including observing a standard setting and a Technical Advisory Committee meeting. Specifically, interns will:
- Analyze the large datasets at Curriculum Associates, which include some of the most extensive assessment and instructional data in the nation.
- Collaborate with other members of the assessment development, psychometrics, and research teams.
- Learn about how an operational computerized adaptive assessment is managed.

Research Project
Interns will focus on at least one major project with the goal of presenting findings at a national conference. Some general topics include the following:

Research:
- Model the psychometric properties of the i-Ready Instruction lesson quizzes.
- Develop instructional profiles based on longitudinal usage metrics.
- Examine the relationship between diagnostic duration, number of testing sessions, and student engagement.
- Conduct research related to the COVID-19 pandemic and/or inequities in our educational systems.
- Assist in the development and refinement of a validity argument for i-Ready Instruction.

Psychometrics:
- Investigate IRT calibration requirements in a vertically scaled CAT.
- Investigate and define requirements for a representative equating sample to implement through a continuous field test design.
- Develop data visualization options for IRT and classical item statistics.
- Explore DIF methodologies for large-scale assessments in a CAT environment (a minimal sketch of one classical approach follows these lists).

Note that the lists above are not exhaustive. Interns may find interest in a project not listed here or may develop a research question of their own.
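To make one of these topics concrete, below is a minimal, hypothetical R sketch of a Mantel-Haenszel DIF screen for a single item. The function and variable names are invented for illustration, and a CAT environment would complicate the score-matching step considerably; this sketch covers only the classical fixed-form case.

    # Minimal, hypothetical Mantel-Haenszel DIF screen for one item (illustrative only).
    # item  : 0/1 scores on the studied item
    # group : "ref" (reference) or "foc" (focal) for each examinee
    # total : matching variable, e.g., total test score
    mh_dif <- function(item, group, total) {
      num <- den <- 0
      for (idx in split(seq_along(item), total)) {  # stratify on the matching score
        a <- sum(item[idx] == 1 & group[idx] == "ref")
        b <- sum(item[idx] == 0 & group[idx] == "ref")
        c <- sum(item[idx] == 1 & group[idx] == "foc")
        d <- sum(item[idx] == 0 & group[idx] == "foc")
        n <- length(idx)
        num <- num + a * d / n
        den <- den + b * c / n
      }
      or <- num / den               # Mantel-Haenszel common odds ratio
      -2.35 * log(or)               # ETS delta scale; |delta| >= 1.5 flags potential "C" DIF
    }

With a scored response matrix resp and a group vector in hand, mh_dif(resp[, j], group, rowSums(resp)) would return the delta-scale estimate for item j (all names hypothetical).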

Qualifications for Interns
Candidates for the internship will ideally have the following qualifications:
- Currently pursuing a doctorate in educational measurement or a related field. Preference will be given to students with at least two years of doctoral-level coursework completed by the start of the internship.
- Must have intermediate proficiency in data manipulation and analysis in SAS and/or R.
- Must have a strong interest in applied psychometrics and research that impacts student learning.

Funding, Length, and Location of Internship
- Interns will work remotely for eight weeks during the summer, ideally from June 14 to August 6. Flexibility can be offered in the internship dates if needed.
- Interns will be considered temporary employees of Curriculum Associates.
- Interns work 40 hours/week and will be paid $25 per hour (equivalent to $8,000 for 8 weeks).

Questions and Next Steps
Please provide the following information in your application:
- A cover letter indicating your preference between the psychometric and research tracks and outlining your current research interests, your reasons for interest in the internship, and the name of your advisor.
- A current resume / CV.
- A letter from your advisor that indicates your advisor is supportive of your involvement in the internship at this point in your graduate work.

Applications will be considered on a rolling basis, with a final deadline of March 1, 2021 at 9:00am ET. Potential interns will be interviewed over video and final selections will be made in mid-March.

Apply here: https://jobs.jobvite.com/careers/curriculumassociates/job/ojovefwa?__jvst=Employee%20Referral&__jvsd=seH0Lhws&__jvsc=email&bid=nkjuFSw9

Edmentum

RLE Psychometrics Internship


Description

Edmentum is committed to making it easier for educators to individualize learning for every student through simple technology, high-quality content, actionable data, and customer success. Founded in innovation, Edmentum offers powerful learning solutions that blend technology with individual teaching approaches. We are dedicated to being educators’ most trusted partner in creating successful student outcomes everywhere learning occurs. Our commitment is built on the emphasis we place on our core values: passion, people, respect, collaboration, and performance.

At Edmentum, we are committed to building research-based curriculum and assessment solutions that empower educators and improve student outcomes across the globe. The Edmentum Research and Learning Engineering Summer Internship - Psychometrics is a paid internship opportunity for curious and collaborative graduate students who are interested in pursuing a career as a psychometrician and are deeply interested in exploring educational solutions that truly meet the needs of learners and educators.

For the summer 2021 internship, the psychometrics intern will collaborate with Edmentum Research Scientists and Learning Engineers to conduct operational psychometric analyses examining the reliability and validity of Edmentum’s Exact Path computer adaptive diagnostic assessment for students in K-12. The intern will gain experience in applied psychometric analyses, computer adaptive testing, educational technology, processes and tools used in the psychometric and educational technology industries, and in working with large data files (>10 million records).

With support from Edmentum Research Scientists and Learning Engineers, the psychometrics intern will:

- Modify existing analysis code and write new code to prepare data and perform psychometric analyses
- Interpret analysis findings and provide written descriptions to update the Exact Path technical report
- Explore and recommend tools for modernizing the format of the Exact Path technical report (e.g., by shifting away from Word documents and PDFs)
- Create internal dashboards with real-time monitoring of the psychometric quality of the Exact Path assessment

Edmentum is also interested in supporting the intern’s unique research interests through projects that can be completed with existing or simulated data and that have promise to lead to conference presentations and/or publications while simultaneously improving Edmentum products. Although not required, applicants with particular research questions related to K-12 computer-adaptive testing are encouraged to propose research ideas in their cover letter.
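For applicants newer to adaptive testing, the select-administer-update loop at the heart of a CAT can be sketched in a few lines of R. The 2PL item bank, EAP scoring, and fixed 20-item length below are simulated, simplified assumptions, not Edmentum’s actual engine.

    # Minimal, hypothetical maximum-information CAT loop (illustrative only).
    set.seed(2)
    n_items <- 100
    a <- rlnorm(n_items, 0, 0.3)         # 2PL discriminations
    b <- rnorm(n_items)                  # 2PL difficulties
    theta_true <- 0.8                    # simulated examinee ability

    p2pl <- function(th, a, b) plogis(a * (th - b))
    info <- function(th, a, b) { p <- p2pl(th, a, b); a^2 * p * (1 - p) }

    # EAP ability estimate over a quadrature grid with a N(0, 1) prior
    grid <- seq(-4, 4, length.out = 81)
    eap <- function(items, x) {
      loglik <- sapply(grid, function(t)
        sum(dbinom(x, 1, p2pl(t, a[items], b[items]), log = TRUE)))
      w <- exp(loglik - max(loglik)) * dnorm(grid)   # posterior weights
      sum(grid * w) / sum(w)
    }

    given <- integer(0); resp <- integer(0); theta_hat <- 0
    for (step in 1:20) {
      avail <- setdiff(seq_len(n_items), given)
      nxt   <- avail[which.max(info(theta_hat, a[avail], b[avail]))]  # most informative item
      x     <- rbinom(1, 1, p2pl(theta_true, a[nxt], b[nxt]))         # simulated response
      given <- c(given, nxt); resp <- c(resp, x)
      theta_hat <- eap(given, resp)                                   # update ability estimate
    }
    theta_hat   # should land near theta_true

Operational CATs add exposure control, content constraints, and principled stopping rules, but this loop is the core around which the psychometric analyses described above are built.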

Preferred Qualifications:
- Enrolled in a graduate program in educational measurement, psychometrics, statistics, learning sciences, or a related field
- Have completed at least one year of graduate-level statistics coursework
- Have completed at least one introduction to measurement / psychometrics course
- Proficient with basic R programming skills (data preparation, descriptive statistics, data visualizations) with desire to learn more advanced skills
- Excellent writing skills
- Self-starter who can work independently (with appropriate support from Edmentum Research Scientists)

This position is remote and will last 8 to 12 weeks depending on the intern’s availability. Due to frequent collaboration with Research and Learning Engineering team members, working hours will follow typical central time business hours with some flexibility as needed.

Edmentum is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, disability status, protected veteran status, or any other characteristic protected by law.

To apply: https://recruiting.ultipro.com/PLA1009PLATO/JobBoard/04e3a767-9600-42c4-86c9-5912bbf18813/OpportunityDetail?opportunityId=e8fdec82-72ad-4057-9ede-05100c5c9d6f

ETS Harold Gulliksen Psychometric Research Fellowship 

Description

The fellowship provides funding for promising graduate students in psychometrics or a related field who are conducting innovative, applied research. Selected fellows will carry out their proposed research under the mentorship of their academic advisor and in consultation with their ETS mentors over the course of the funding period. The fellowship begins with the selected fellows working on their research as part of the Summer RMS Internship Program, where they will receive guidance from ETS’s leading researchers. During the subsequent academic year, fellows study at their universities and carry out research under the supervision of an academic mentor and in consultation with their ETS mentor.

Program Goals

The goal of this program is to increase the number of well-trained scientists in educational assessment, psychometrics and statistics.

Important Dates

    • January 22, 2021 — Deadline for receipt of preliminary application materials
    • February 5, 2021 — Applicants are notified of preliminary application decision
    • March 5, 2021 — Deadline for receipt of final application materials
    • March 31, 2021 — Award recipients are notified

Award Duration

Appointments are for one year.


Award Value

Each fellow's university receives the following:

  • $20,000 to pay a stipend to the fellow
  • $8,000 to defray the fellow's tuition, fees and work-study program commitments
  • A small grant to facilitate work on the fellow's research project

Selected fellows must participate in the Summer RMS Internship Program in Research for Graduate Students.

Eligibility

At the time of application, candidates must be enrolled in a doctoral program, have completed all the coursework toward the doctorate, and be at the dissertation stage of their program. Dissertation topics in the areas of psychometrics, statistics, educational measurement, or quantitative methods will be given priority. At the time of application, candidates will be asked to provide a statement describing any additional financial assistance, such as assistantship or grant commitments, that they will have during the fellowship period.

Selection

Selection is based on:

  • Strength of the applicant's academic credentials. Applicants need to demonstrate superior academic ability and achievement, as well as exceptional promise in the field of measurement, psychometrics or statistics.
  • Suitability and the technical strength of the proposed research project. The project must relate to research currently under way at ETS. The preferred arrangement is that the proposed project be the applicant's doctoral thesis and that the applicant is not receiving alternative funding for this work. Non-dissertation projects may be considered provided that the applicant is doing significant independent work.

Application Procedures

Submit all application materials via email with PDF attachments.

Preliminary Application

  • Letter of interest describing the research that would be undertaken during the award year and how the research fits with ETS research efforts
  • Statement describing any additional financial assistance, such as assistantship or grant commitment, that you would have during the fellowship period
  • Nomination letter (either as an email or as an email with a PDF attachment) from an academic advisor in support of your interest in the fellowship award
  • Current curriculum vitae

Final Application

If your preliminary application is approved, you will be invited to submit the following materials:

  • Detailed project description (approximately 15 double-spaced pages) of the research the applicant will carry out at the host university, including the purpose, goals, and methods of the research
  • Graduate academic transcripts (unofficial copies are acceptable)
  • Evidence of scholarship (presentations, manuscripts, etc.)

Contact

For more information, contact us via email.

Please find the announcement here:

https://www.ets.org/research/internship-fellowship/gulliksen