Prep list for attending and presenting at conferences

Things to bring when presenting at a conference:

  • pointer
  • business cards
  • for AECT: name badge holder (I have three from previous years)
  • DisplayPort adapter
  • advertisement flyers/handouts

Notes from AECT 2015

Held at the Hyatt Regency Hotel, Indianapolis.

Nov 6, 2015

  • My presentation, Collaboration study: talked with Dr. Brill after my presentation; she submitted a paper about ID students’ learning from peer review. Among future research topics, the audience was interested in why both higher and lower scorers performed better in a lower-performing group than in a higher-performing group.
  • Engagement in PBL: made contact with a middle school Spanish teacher in Fairfax, VA. Her students are mostly Korean, and the school principal is very open to studies being run in the school. She said she will contact me to talk further.
    • Suggestions on the engagement survey
      • For the behavioral engagement item that asks whether students put more time into this project than into other projects, provide a scale, e.g., -50% – -25% – 0% – +25% – +50%.
      • Review the survey and revise any items that lead students toward positive responses; make every item neutral.

Nov 5, 2015

Got a consultation from faculty with 5+ years of experience on how to keep developing and expanding a research agenda and papers.


Nov 4, 2015

#5 Case-based MOOC (Won)

  • Case-based learning (CBL) eLearning designs to address ethical dilemmas in genomics that also lead to satisfaction with learning
  • Project (grant) goal: “develop a university level, case-based, online course on Ethics in Genomics for broad dissemination through a MOOC”
  • They hired a videographer and used WebEx.
  • Micro-collaboration model (Aleckson & Ralston-Berg, 2011?) <– ask Won
  • Research questions: continuous formative assessment
  • Richey & Klein, 2007, design and development research.
  • Aleckson & Ralston-Berg (2011), Mindmeld: Micro-collaboration between eLearning designers and instructor experts.
  • Rubric development: who developed it, how, and who evaluated it? –> get a rubric sample
  • To track student learning achievement, they checked content knowledge.
  • They analyzed student satisfaction with an open-ended survey –> only that?
  • Results
    • Students want more case options and an option to set their own completion pace –> for the LCC MOOC, I think it would be great to offer a flexible completion pace. Why not?
    • Students prefer discussion, but break discussions into groups with small numbers of students.
    • They hated the essay-type format.

#4 MOOCocracy (Jaimie Loizzo)

  • Examined motivation, success, and completion
  • Adult learners’ social science MOOC experience (instructor, social, and cognitive presences, using the Community of Inquiry framework) –> I think this is one of the things I could do for the LCC MOOC
  • Narrative analysis: asked students what “success in a MOOC course” means to them
  • Barriers: time and course design (disliked videos, discussions, etc.)

#3 Detecting at-risk students in asynchronous online discussion

  • “At-risk students” is not an appropriate term… what else? Lower-performing students? Potential…
  • Used proxy variables: when the actual variable is not measurable, use measurable components of it.
  • Used Random Forest: a prediction method suited to a relatively small sample with a large number of predictors (Bureau et al., 2005); see the sketch after this list.
  • Do active vs. non-active learners predict final learning outcomes?
  • Implication: pre-diagnose students to provide appropriate, personalized interventions/scaffolding
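
A minimal sketch of this kind of prediction, assuming scikit-learn and hypothetical proxy variables derived from discussion logs (post counts, reply counts, login frequency); the data file and column names are illustrative, not from the presentation:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical log-derived proxy features; "lower_performing" is a binary label.
df = pd.read_csv("discussion_activity.csv")
features = ["post_count", "reply_count", "avg_post_length", "login_frequency"]
X, y = df[features], df["lower_performing"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

# Random Forest copes well with many predictors and relatively small samples.
clf = RandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Feature importances hint at which proxy variables drive the prediction.
for name, imp in sorted(zip(features, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```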

#2 Relation between visual attention and learning analytics dashboard comprehension

  • Used an eye-tracking system with BeGaze analysis software
  • Learners’ graph literacy affects their use of the LAD

#1 Learning Analytics Dashboard (Dongho)

  • Identified seven variables expected to affect behavioral engagement
  • Set hypotheses and ran SEM (structural equation modeling); see the sketch after this list
  • Findings/implications: learners need to understand how to use the LAD, and its reflective use should be facilitated
  • Operational definition of LAD (in edX it is called a progress check)
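
A minimal SEM sketch, assuming the Python semopy package; the latent factor, its indicators, and the predictor names are hypothetical stand-ins (the presentation’s actual seven variables were not recorded):

```python
import pandas as pd
from semopy import Model

# lavaan-style description: a measurement model (behavioral engagement as a
# latent factor with three hypothetical indicators) plus structural paths
# from hypothetical predictors.
desc = """
engagement =~ eng1 + eng2 + eng3
engagement ~ dashboard_use + graph_literacy + self_regulation
"""

data = pd.read_csv("lad_survey.csv")  # hypothetical survey data
model = Model(desc)
model.fit(data)          # estimates loadings and path coefficients
print(model.inspect())   # parameter estimates, std. errors, p-values
```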