NSF Learning Analytics Workshop


This NSF workshop brings together learning scientists and data scientists with varied backgrounds and expertise to collaboratively address the research challenges of developing instruction, assessing the competence of current and prospective workers, and evaluating learning tools. Three specific questions will be addressed: how to define competence, how to measure it, and how to evaluate new approaches to learning. Speakers are invited from across industry and academia to ensure a broad perspective and, in particular, to take employers' perspectives into account. Please join us for an exciting event and lively discussions.

Please register if you would like to attend.

Schedule:
March 18, 2019
8:00 a.m. – Registration
8:30 a.m. – Welcome and Introductions

  • Stephanie Teasley, Research Professor, School of Information, University of Michigan
  • Rada Mihalcea, Professor, Computer Science and Engineering, University of Michigan
  • Henry Kelly, Senior Scientist, Michigan Institute for Data Science, University of Michigan

8:45 a.m. – Talks and discussion on defining competence

  • Marie Cini, President and CEO, The Council for Adult and Experiential Learning
  • David Blake, CEO, Degreed

9:40 a.m. – Talks and discussion on measuring competence

  • Bror Saxberg, Vice President, Learning Science, Chan Zuckerberg Initiative
  • Tammy Wang, Vice President, Data Science and Analytics, Riviera Partners

10:35 a.m. – Talks and discussion on evaluating new approaches to learning

  • Norman Bier, Director, Open Learning Initiative and DataLab, Carnegie Mellon University
  • Yun Jin Rho, Director, Efficacy Analytics and Studies, Pearson

11:30 a.m. to 12:00 p.m. – Networking

March 19, 2019

9:45 a.m. to 12:50 p.m. – Panel discussions on each of the three topics
12:50 p.m. – Concluding remarks and discussion of next steps
1:00 p.m. – Adjourn

Study on bias in learning analytics earns Brooks Best Full Research Paper Award at LAK conference


A paper co-authored by University of Michigan School of Information research assistant professor Christopher Brooks received the Best Full Research Paper Award at the International Conference on Learning Analytics & Knowledge (LAK) in Tempe, Arizona. The award was announced on the final day of the conference, March 7, 2019.

The paper, “Evaluating the Fairness of Predictive Student Models Through Slicing Analysis,” describes a tool designed to test the bias in algorithms used to predict student success.

The goal of the paper, Brooks says, was to evaluate whether the algorithms used to predict whether students would succeed in massive open online courses (MOOCs) were skewed by the gender makeup of the classes.

“We were able to find that some have more bias than others do,” says Brooks. “First we were able to show that different MOOCs tend to have different bias in gender representation inside of the MOOCs.”
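To give a sense of what a slicing-style evaluation involves, here is a minimal, hypothetical sketch: train a single completion-prediction model, then evaluate it separately on each gender slice and compare the results. The data file, feature names, and model choice below are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch of slicing analysis: one model, evaluated per gender slice.
    # Column names, features, and the CSV file are illustrative assumptions.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("mooc_features.csv")  # hypothetical per-learner feature table
    features = ["videos_watched", "forum_posts", "quiz_attempts"]

    X_train, X_test, y_train, y_test, g_train, g_test = train_test_split(
        df[features], df["completed"], df["gender"], test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]

    # Slice the evaluation: compute AUC separately for each gender group.
    # A large gap between slices suggests the predictor is more reliable for one group.
    for group in g_test.unique():
        mask = g_test == group
        print(group, roc_auc_score(y_test[mask], scores[mask]))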

Read more…

AIM Analytics Seminar – Dan Davis, PhD Candidate, TU Delft, the Netherlands.


Improving Online Learning Outcomes Using Large-Scale Learning Analytics

Abstract: This talk will cover a holistic approach to improving learning outcomes and behavior in large-scale learning environments—namely MOOCs. I begin by sharing the results of a study exploring the extent to which learners follow (or deviate from) the designed learning path and the impact this behavior has on eventual learning outcomes. We next take a deeper dive into the design of online courses with a large-scale learning design approach, where I’ll present an automated method developed to categorize courses based on their design. With these trends in learning & teaching behavior in mind, the talk will conclude with the results of a series of randomized experiments (A/B tests) carried out in live MOOCs designed to provide additional support to learners. From these experiments we arrive at a better understanding of which instructional & design strategies are most effective for improving learning outcomes and behavior at scale.
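As a rough illustration of the kind of comparison such A/B tests involve, the sketch below checks whether completion rates differ between a control arm and an arm given additional support. The counts and library choice are invented for illustration and are not taken from the talk.

    # Hypothetical sketch of analyzing a MOOC A/B test: compare completion rates
    # between a control arm and an arm that received extra support.
    # The counts below are invented for illustration.
    from statsmodels.stats.proportion import proportions_ztest

    completed = [412, 468]   # completions in control arm vs. support arm
    enrolled = [5000, 5000]  # learners randomized into each arm

    z_stat, p_value = proportions_ztest(count=completed, nobs=enrolled)

    print(f"Control completion rate: {completed[0] / enrolled[0]:.1%}")
    print(f"Support completion rate: {completed[1] / enrolled[1]:.1%}")
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")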

Bio: Dan’s research uses and advances learning analytics techniques in open, online education at scale by pushing the boundaries towards personalized & adaptive learning environments. Dan develops methods to gain a deeper understanding about how the design of online learning environments affects learner success and engagement, often by implementing and testing instructional interventions at scale using randomized controlled experiments. Dan earned his BA in English, Writing & Mass Communication with a minor in Graphic Design from Assumption College in Worcester, Mass. His MA is from Georgetown University in Communication, Culture & Technology, and he is currently finishing his PhD in Computer Science, Learning Analytics from TU Delft in the Netherlands.

Lunch will be provided.

AIM Analytics is a bi-weekly seminar series for researchers across U-M who are interested in learning analytics. Learning analytics is a multi- and interdisciplinary field that brings together researchers from education, the learning sciences, computational sciences and statistics, and all discipline-specific forms of educational inquiry.

MIDAS Learning Analytics Challenge Symposium


Learning analytics is one of the research focus areas that MIDAS supports with its Challenge Awards.  Our long-term goal is to support this research area more broadly, using the Challenge Award projects as the starting point to build a critical mass.  This symposium offers a platform for all participants to explore collaboration opportunities and aims to attract more researchers to our hub.  It will feature in-depth presentations from two Challenge Award teams, and all participants are encouraged to submit posters on research related to Learning Analytics.

Agenda

9 am to 11:30 am: Welcome and Challenge Award presentations

11:30 am to 1 pm: Lunch, Poster Session, Networking [poster dimensions: up to 6 ft wide × 4 ft high]

1 to 2 pm: Panel discussion: The Future of Data Science for Learning Analytics at U-M

Panelists:

  • Steve DesJardins, Education, Public Policy
  • Cynthia Finelli, Engineering Education Research Program
  • Al Hero (Moderator), MIDAS, Electrical Engineering and Computer Science
  • Rada Mihalcea, Computer Science and Engineering
  • Stephanie Teasley, Information

Please register online and submit poster abstracts (fewer than 300 words). Submission deadline: May 15.

For questions: midas-research@umich.edu.

Recommended Visitor Parking: Palmer Parking Structure, Palmer Drive, Ann Arbor

IOE 899 Seminar Series: Stanley Hamstra, PhD, Milestones Research & Evaluation, Accreditation Council for Graduate Medical Education


Stanley J. Hamstra, PhD

VP, Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education

“Learning Analytics in Graduate Medical Education: Realizing the Promise of CBME with Milestones Achievement Data”

Abstract: In 2012, the Accreditation Council for Graduate Medical Education (ACGME) introduced the Next Accreditation System (NAS) for improving postgraduate medical education. An important component of the NAS is a shift towards competency-based medical education (CBME), involving milestones as markers of achievement during training. Since 2015, the ACGME has been collecting milestones achievement data (competency ratings) on all resident and fellow physicians in accredited training programs in the USA (n > 110,000 residents and fellows per year). A critical assumption in CBME is that assessment data regarding any learner (in any form) contains some degree of uncertainty. At the same time, program directors must make finite/binary decisions about learners at the time of graduation, and indeed throughout training. The availability of milestones data, in the context of national trends, gives the program director an additional tool for making the best decisions regarding learner progression (and ultimately graduation). I will briefly review tools we have developed to help program directors make use of milestones data to enhance the quality of their decisions regarding resident progression and graduation. In addition, I will outline an approach to using the data for enhancing national curricula within a specialty.

Bio: Dr. Hamstra is responsible for oversight and leadership regarding research in Milestones and assessment systems that inform decisions around resident physician progression and board eligibility. Dr. Hamstra works with medical subspecialty societies, program director organizations, the American Board of Medical Specialties, and specialty certification boards. His research addresses medical education broadly, including competency assessment for residency training programs, and developing administrative support for educational scholarship within academic health settings. Prior to joining the ACGME, Dr. Hamstra was at the University of Michigan, the University of Ottawa, and the University of Toronto Department of Surgery. He has also worked closely with the Royal College of Physicians and Surgeons of Canada on developing policies regarding competency-based medical education for graduate medical education. Dr. Hamstra received his PhD in sensory neuroscience from York University in Toronto in 1994.

Learning Analytics Summer Institute, University of Michigan, Ann Arbor, MI


The 2017 Learning Analytics Summer Institute (LASI) will be hosted by the University of Michigan in Ann Arbor. LASI17 will not be just one big gathering in one place but a network of online and face-to-face events. This year, LASI participants will be able to choose two tutorials and one immersive workshop. The program schedule is located here.

LASI17 is open to members of the Society for Learning Analytics Research (SoLAR) only, so please join or renew your SoLAR membership prior to registering for LASI17. Click here to become a SoLAR member in order to attend, as well as to receive other year-round benefits.

Support is available for students; the student scholarship application deadline is May 31, 2017, at 5 p.m. (EST).

Please visit our video archive or our official YouTube channel for more details about previous events.

MIDAS awards first round of challenge funding in transportation and learning analytics


Four research projects — two each in transportation and learning analytics — have been awarded funding in the first round of the Michigan Institute for Data Science Challenge Initiatives program.

The projects will each receive $1.25 million from MIDAS as part of the Data Science Initiative announced in fall 2015.

U-M Dearborn also will contribute $120,000 to each of the two transportation-related projects.

The goal of the multiyear MIDAS Challenge Initiatives program is to foster data science projects that have the potential to prompt new partnerships between U-M, federal research agencies and industry. The challenges are focused on four areas: transportation, learning analytics, social science and health science.