Internships & Fellowships



         
The following information is maintained by the Graduate Student Issues Committee (GSIC).

If you would like to have your internship or fellowship listed here, please contact our GSIC co-chair:
Delwin Carter: delwincarter@ucsb.edu
   


Graduate Student Internships

Internships are a valuable way to link your academic experience with the professional arena. Below is a list of internships that will allow students to go beyond the classroom and conduct practical research with a mentor from a testing company or research agency.


Graduate Student Fellowships

Fellowships provide structured work experience and professional development that include intensive training and experiential learning. Below is a list of fellowships that provide support to the fellow's growth and opportunities to explore a particular field of measurement.

Alpine Testing Solutions Inc.

Summer Internship 2020

Alpine Testing Solutions is seeking an advanced graduate student to work with our psychometric team for an 8-week internship during the summer of 2020.

About Alpine Testing Solutions, Inc.

Alpine stands apart from other companies in the industry for several reasons:

  • Alpine’s focus is on providing custom psychometric solutions to our clients rather than applying a standard solution. This allows our team to be innovative in the solutions we provide our clients.
  • As a relatively small company with 54 employees, we do not have the organizational complexities experienced in a large corporation. However, 20% of Alpine’s staff are Ph.D.-level psychometricians, so we have considerable expertise and depth of knowledge on staff, benefiting both our clients and our fellow team members.
  • Our psychometric team provides services across a wide range of industries including professional credentialing, IT credentialing, and education.
  • We are completely virtual, meaning that all employees work from home.
  • We are 100% employee owned so staff are literally invested in the company.

Internship Details

The successful candidate will work virtually with a mentor on the psychometric team on a focused research project for the duration of the 8 weeks. The work will primarily occur in June and July; however, the exact dates can be determined in collaboration with the successful candidate. The position requires 40 hours a week, during business hours, for the internship duration. Total compensation for this opportunity is $6,400.

The research project will focus on innovations to the job task/domain analysis and blueprinting process. The successful candidate will work with their assigned mentor to negotiate the specific research goals and related work for the research project. Throughout the summer, the intern will be expected to meet regularly with their mentor to provide progress updates and work collaboratively on the project. At the conclusion of the internship, ideally the project will result in conference presentation proposals and/or journal submissions. 

Qualifications

  • Two years of completed coursework and active enrollment in a Ph.D. program in educational measurement, psychometrics, I/O psychology, or related field.
  • Exceptional critical thinking skills
  • Self-motivated and able to work independently
  • Excellent written and verbal communication skills
  • Experience with IRT analysis software such as Winsteps and IRT Pro

Applications

To apply for this opportunity, please submit a curriculum vitae as well as a letter of interest to jobs@alpinetesting.com by March 31, 2020.

National Board for Professional Teaching Standards

2020 Summer Assessment Research Internship

The National Board for Professional Teaching Standards’ mission is to advance student learning and accomplishment by elevating the quality of teaching through voluntary advanced certification.

Job Summary:

The National Board for Professional Teaching Standards is seeking a well-qualified individual for an internship opportunity to gain experience in assessment research. The intern will work with the National Board’s psychometrician to determine which research projects to pursue and will assist with research in support of the National Board’s test development processes and psychometric methodologies. The Summer Assessment Research Internship will start on or around June 4, 2020; the specific date will be determined by the intern and the mentor, and the internship will last approximately two months.

Summary of Key Responsibilities:

The intern and mentor will work together to define a research question; review related research; conduct data analyses; and write a report. The final research project will be determined based on a combination of intern interest and research importance. Research projects are summarized below.

Evidence of Content Validity

The National Board has data available to strengthen the evidence of its assessments’ content validity. This research project will involve analyzing how well the Component 1: Content Knowledge items align with the standards or content specifications and preparing a report on the findings.

Content Validation Study

Design of a content validation study is underway for the National Board’s new assessment of Maintenance of Certification (MOC). By carrying out a content validation study, the National Board wishes to establish how National Board Certified Teachers, i.e., SMEs in the area of accomplished teaching, view the tasks required for MOC in terms of representativeness and relevance to the standards and the Five Core Propositions that underpin Board certification. The study is expected to take place in summer 2020 and will involve facilitating an online study and/or analyzing data as they are collected, along with writing up the results.

Timing Analyses

How much time do candidates use on the selected- and constructed-response items of Component 1: Content Knowledge? Are there differences in testing time or behavior across race/ethnicity and sex? This research project will involve reviewing previously conducted research on testing time analyses, conducting analyses of response and completion times for the entire set of Component 1 forms across all certificate areas, and preparing a report of the research.

Longitudinal Research on Item Performance

As part of the National Board’s initiative to monitor the health of its item bank, the intern will examine the temporal stability of test items over the course of several years.

Summary of Experience and Required Competencies (Knowledge, Skills and Abilities):

  • Graduate student at an accredited university, preferably in educational measurement, curriculum studies, statistics, research methods, quantitative psychology, or a related field, with an emphasis on psychometrics and statistical data analysis; completion of two or more years of graduate coursework.
  • Intern will be assigned to one mentor, but must be able to work independently and exercise sound judgment.
  • Competency in, and personal access to, statistical software.
  • Ability to manage large data sets.
  • Strong research, analytical, writing, and communication skills.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school's international student office, and have a social security number for payroll purposes.

Position Type and expected hours of work:

  • This is a part-time, temporary, time-limited position beginning on or around June 4, 2020 and lasting up to two months.

Location:

  • Arlington, VA or Remote Telework

Compensation:

  • Total compensation for the two months is approximately $4,000.

To Apply:

Please email the following by April 10, 2020 to: intern@nbpts.org 

  1. A cover letter including your research area(s) of interest, what you hope to achieve from the internship, what you can contribute to the organization, and evidence of your ability to work independently on a research project,
  2. your current resume or curriculum vitae, and
  3. two letters of recommendation (one must be from your academic advisor).

All applicants will be notified of selection decisions by April 27, 2020.

AAP/EEO Statement:

The National Board provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

The College Board 

The College Board Psychometric Intern - Summer 2020

Named by Fast Company as one of the most innovative education companies, the College Board is a mission-focused organization and a powerful force in the lives of American students. To fulfill our purpose of clearing a path for students to own their future, we offer access, opportunity, and excellence to millions of students each year. Over the past five years, the College Board has been undergoing a transformation. We’ve redesigned the SAT, PSAT/NMSQT, and many AP courses and exams, and we’ve introduced the PSAT 10, PSAT 8/9, and Official SAT Practice on Khan Academy, all with great success. 

The Psychometrics Department is looking for two doctoral summer interns for 2020; each intern will work with two mentors on a specific project in the area of psychometrics. 

The internship spans eight weeks, starting on June 8th and ending on July 31st, with a full-time workload (40 hours per week). It is designed to give interns the opportunity to work closely with psychometricians and gain hands-on experience with College Board data and projects. Interns are expected to perform a literature review, conduct analyses, write a research report, and present the research to College Board staff at the conclusion of the project. 

To be eligible: 

  • Interns must be full-time doctoral students at an accredited four-year university.
  • Strong preference will be given to advanced students in the process of completing their dissertations.
  • Graduate students in psychometrics, measurement, quantitative & mathematical psychology, educational psychology, industrial-organizational psychology, statistics, or related fields are invited to apply.
  • Experience with statistical software (SAS, SPSS, and/or R) is required; working knowledge of Classical Test Theory and Item Response Theory is desired.
  • Students must be eligible to be legally employed in the United States (international F-1 visa students, please read details below).

  • Internship timing: eight weeks (June 8–July 31, 2020); international students require prior clearance from the College Board before the internship begins.
  • Positions: 2
  • Hours per week: 40
  • Location: Newtown, PA office (Yardley, PA)
  • This is a paid internship.
  • A housing stipend can be offered when deemed necessary.
  • Application deadline: February 28, 2020. Applicants will be informed about acceptances by March 13, 2020.
  • Please indicate in your cover letter which project you would like to be considered for.

Interns will work with two College Board mentors on a specific project in the area of psychometrics. Possible topics include: 

Project 1: Investigation of Methods to Evaluate Item Fit Plot 

In an operational testing environment, it is imperative that psychometric work be completed in a timely manner. One task that can be time consuming is the evaluation of item characteristic curve plots for fit. The proposed study is to evaluate variants of a method to evaluate item fit plots, with the goal of alleviating the need for human review. The study will utilize operational data and requires some advanced programming in R and familiarity with FlexMIRT. Familiarity with aberrancy detection methods is a plus. 

The goal of this project is to investigate an automated method to determine whether the empirical data in an item plot fit the item characteristic curve (ICC). If the method proves sufficiently effective, the process developed could be used to reduce the amount of time and resources spent reviewing IRT item plots for pretest items. 

The basic procedure involves the following steps: 

1. The calibration data is split into two datasets. The procedure for splitting the data is described below. 

2. Two sets of item parameters are calibrated, one from each dataset 

3. The ICCs from one calibration are compared with the corresponding ICCs from the other 

4. Items with large differences between their ICCs are then removed 

5. The procedure is then repeated until no items are flagged for removal 

6. The excluded items are tagged as having poor item fit 

The crux of this study is to identify a procedure that splits the data in a manner that identifies items with poor item fit. A few of the options being considered are: 

1. Institution size 

2. Lz statistic 

3. Random assignment 

4. Admin Region 

The final list of options will be identified in collaboration with the intern. We plan to use SAT data from a pretest administration for the analysis. The results will be compared to the list of pretest items flagged under the current review process. 
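As a rough illustration of steps 1–6 above, here is a minimal sketch. The posting calls for R and FlexMIRT; this stand-in uses Python with a deliberately crude total-score 2PL calibration, and every function name, threshold, and simulation setting below is hypothetical, not part of the actual study design:

```python
import numpy as np

rng = np.random.default_rng(0)

def icc(theta, a, b):
    """2PL item characteristic curve."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def calibrate(resp, lr=0.3, steps=300):
    """Crude 2PL calibration against a standardized total-score ability
    proxy (a toy stand-in for a FlexMIRT calibration)."""
    total = resp.sum(axis=1)
    theta = (total - total.mean()) / (total.std() + 1e-9)
    params = np.zeros((resp.shape[1], 2))
    for j in range(resp.shape[1]):
        a, b = 1.0, 0.0
        for _ in range(steps):  # gradient ascent on the 2PL likelihood
            p = icc(theta, a, b)
            a += lr * np.mean((resp[:, j] - p) * (theta - b))
            b += lr * np.mean((resp[:, j] - p) * (-a))
        params[j] = a, b
    return params

def flag_misfit(resp, split, threshold=0.15, max_iter=10):
    """Steps 1-6: split the sample, calibrate each half, compare ICCs on
    a theta grid, remove items whose curves disagree, repeat until no
    item is flagged; excluded items are tagged as having poor fit."""
    grid = np.linspace(-3, 3, 31)
    flagged, keep = set(), list(range(resp.shape[1]))
    for _ in range(max_iter):
        pa = calibrate(resp[split == 0][:, keep])
        pb = calibrate(resp[split == 1][:, keep])
        diffs = [np.max(np.abs(icc(grid, *pa[j]) - icc(grid, *pb[j])))
                 for j in range(len(keep))]
        bad = [keep[j] for j, d in enumerate(diffs) if d > threshold]
        if not bad:
            break
        flagged.update(bad)
        keep = [j for j in keep if j not in flagged]
    return sorted(flagged)

# Simulate 3000 examinees on 10 items. Item 9 misbehaves: its difficulty
# shifts by 1.5 logits between the two halves of the split (think large
# vs. small institutions), so its two half-sample ICCs disagree.
n, k = 3000, 10
theta = rng.normal(size=n)
a_true = rng.uniform(0.8, 1.6, size=k)
b_true = rng.uniform(-1.0, 1.0, size=k)
split = (np.arange(n) % 2).astype(int)
b_item = np.tile(b_true, (n, 1))
b_item[:, 9] = np.where(split == 0, -0.75, 0.75)
resp = (rng.random((n, k)) < icc(theta[:, None], a_true, b_item)).astype(int)

flags = flag_misfit(resp, split)
print(flags)
```

The interesting design question, as the paragraph above notes, is the splitting rule: a random split mostly detects sampling noise, while splits by institution size, Lz, or admin region target specific sources of misfit.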

Project 2: Impact of Population Variation on Equating Error and Scale Stability 

Testing programs usually offer more than one administration throughout a year, which may lead to test-taker population variation across those administrations. Meanwhile, equating results based on different populations may vary and it is important to evaluate the impact of population variation on equating to guide and inform operational equating practice. 

This research study will evaluate whether equating invariance holds for tests administered to test-takers with various characteristics. Data will be simulated in the IRT framework and evaluated by both classical and IRT equating methods. 

The primary research questions are (1) whether equating based on test-takers from different administrations yields similar results, and (2) which equating methods provide robust, stable equating solutions. This study will also investigate the impact of population variation on equating for different kinds of test forms (e.g., with vs. without an essay, easy vs. hard, and high vs. low reliability). 

Skills Required: 

• Equating 

• Item Response theory 

• Strong programming skills in C++, SAS, or R 
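A minimal sketch of the population-sensitivity question: simulate Rasch (1PL) responses to two forms, equate form Y onto form X separately within two populations with different ability means, and compare the resulting conversion lines. Everything here is illustrative (mean-sigma linear equating in a single-group design), not the actual study methodology:

```python
import numpy as np

rng = np.random.default_rng(1)

def sumscore(theta, b):
    """Number-correct scores under a Rasch model."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b)))
    return (rng.random(p.shape) < p).sum(axis=1)

def linear_equate(y_scores, x_scores):
    """Mean-sigma linear equating of form Y onto form X (single-group design)."""
    slope = x_scores.std() / y_scores.std()
    return lambda y: slope * (y - y_scores.mean()) + x_scores.mean()

b_x = np.linspace(-2, 2, 40)   # form X difficulties
b_y = b_x + 0.5                # form Y: harder by half a logit

conversions = []
for mu in (-0.5, 0.5):         # two test-taker populations
    theta = rng.normal(mu, 1.0, size=5000)
    eq = linear_equate(sumscore(theta, b_y), sumscore(theta, b_x))
    conversions.append(eq(np.arange(41)))

# Population sensitivity: how far apart are the two conversion lines?
rmsd = float(np.sqrt(np.mean((conversions[0] - conversions[1]) ** 2)))
print(round(rmsd, 2))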

To apply: https://careers.collegeboard.org/job/2010168 

International students who are studying at an accredited university under F1 visas are eligible to apply for the summer internship under Curricular Practical Training (CPT) stipulations. Please note that only two internship positions can be offered. International students should not apply for CPT unless accepted as a summer intern. 

Upon acceptance to the summer internship, we urge students to contact their respective international advisers at their host university as soon as possible to apply for a practical training certificate, which permits F1 visa holders to receive compensation from the College Board for the work they will be completing over the summer. The process to clear a student for CPT may take six weeks or longer. Therefore, we urge students to initiate the process as soon as possible. Additionally, all international students must have a social security number in order to receive compensation. 

 

ETS Post Doctoral Fellowship

Description

Individuals who have earned their doctoral degree within the last three years are invited to apply for a rewarding fellowship experience which combines working on cutting-edge ETS research projects and conducting independent research that is relevant to ETS's goals. The fellowship is carried out in the ETS offices in Princeton, N.J. This year we are seeking applicants with experience in the following areas:

  • Applied Psychometrics
  • Artificial Intelligence Based Automated Scoring
  • Modeling and Scoring Item Responses from Interactive and Simulation-Based Assessments
  • Modeling of Response Processes and Response Times
  • Psychometric Issues in Adaptive Testing Designs
  • Statistical and Psychometric Foundations
  • Statistical and Psychometric Issues in Group-Scored Assessments

Program Goals
  • Provide research opportunities to individuals who hold a doctorate in the fields indicated above
  • Enhance diversity and inclusion by expanding opportunities for underrepresented groups to conduct research in educational assessment and related fields

Important Dates
  • March 1, 2020 — deadline for preliminary application
  • April 15, 2020 — deadline for final application materials

Duration of Program

The fellowship is for a period of up to two years, renewable after the first year by mutual agreement.

Compensation

  • Competitive salary
  • $5,000 one-time relocation incentive for round-trip relocation expenses
  • Employee benefits, vacation, holidays and other paid leave in accordance with ETS policies

Eligibility
  • Doctorate in a relevant discipline within the past three years
  • Evidence of prior independent research

For more information, please visit the ETS Post Doctoral Fellowship announcement.