CDEM Voice – Research Column


Using Surveys in Medical Education Research

Surveys are a common means of acquiring data in medical education research. They can be used to perform needs assessments for curriculum design, evaluate learner and instructor perceptions, or gather information about the attitudes and preferences of individuals in the educational environment.

Surveys are best used to obtain subjective data, such as opinions or attitudes, as these cannot be observed directly. Directly observable data may be gained by various means, such as direct measurement (e.g. using an exam to test content retention), database review (e.g. analyzing student academic records), or third-party observation (e.g. outsourcing analysis of videotaped procedural performance). In contrast, surveys ask respondents to report or recall information, and as such their findings risk being subject to various cognitive biases such as recall bias. For this reason, data should only be obtained by a survey if they are otherwise unavailable due to the impracticality or impossibility of more direct assessment.

When designed and applied properly, surveys can achieve a scientific rigor comparable to psychometric tools. If relevant to the researcher’s questions, previously validated survey tools should be used, as these are more likely to yield high-quality responses and allow for comparison against pre-existing data. Many questions in medical education research may require the construction of new surveys, however, as pre-existing survey questions may not fully capture the information relevant to the study at hand.

Artino et al. describe seven steps in developing a survey tool to maximize its scientific rigor: (1) conduct a literature review to look for pre-existing survey tools or items and to align the new survey with the existing literature; (2) conduct interviews or focus groups to understand how the population of interest conceptualizes the relevant questions; (3) synthesize the first two steps to ensure that the ideas under investigation (a) are expressed in language the population of interest can understand and (b) make theoretical sense based on the literature; (4) develop survey items; (5) conduct expert review of these items to assess content validity; (6) vet these items with members of the respondent population to ensure they are understood as the investigators intend; and (7) run a pilot survey to evaluate response variance and formally analyze the content validity of each survey item. While performing every one of these steps may be too cumbersome for a given survey, incorporating as many as possible will increase the quality and reproducibility of survey responses and make the data more reliable for comparison against future studies.

Furthermore, careful design facilitates the reporting of many important survey considerations. A robust evaluation of the study population will enable clearer description of sampling criteria, which will add to a survey’s reproducibility and indicate the generalizability of its results. Content review and item vetting not only improve the quality of data collected by individual questions, but also help identify the best means of response measurement. For example, an opinion question might utilize a Likert scale or free responses, and these two types of data demand very different analyses. Identifying which response type is best suited to gathering the data of interest is crucial for both data acquisition and reporting.
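To make the Likert-versus-free-response distinction concrete, here is a minimal sketch of the two analysis paths; all responses, ratings, and theme labels below are invented for illustration, not data from any actual survey.

```python
from collections import Counter
from statistics import median

# Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree).
# Ordinal data: report the median and the full distribution rather than a mean.
likert = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
print("Median rating:", median(likert))
print("Distribution:", Counter(likert))

# Hypothetical free-text responses: these must first be coded into themes
# (normally by human raters); only then can they be tallied and compared.
coded_themes = ["workload", "feedback", "workload", "scheduling"]
print("Theme counts:", Counter(coded_themes))
```

The point of the sketch is that the choice of response format is made before data collection, because it locks in which of these analyses is even possible.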

Finally, carefully crafted survey items avoid ambiguous or confusing language. This helps reduce erroneous responses and thus makes inaccurate conclusions less likely. Clear wording can also improve the response rate, which is a crucial piece of information for any survey study to report. As in all study types, achieving appropriate power helps reduce the risk of type II errors, and in survey methodology power depends on the response rate as well as the sample size.
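As a hypothetical illustration of how sample size and response rate interact, the sketch below estimates how many invitations a survey would need for a target margin of error when estimating a proportion; the confidence level, margin, and response rate are invented assumptions, not figures from the column.

```python
import math

def required_invitations(margin=0.05, z=1.96, p=0.5, response_rate=0.6):
    """Estimate how many people to invite so that, given an expected
    response rate, the completed responses support the desired margin
    of error for a proportion at the given confidence level."""
    # Standard sample-size formula for a proportion: n = z^2 * p(1-p) / margin^2
    n_needed = (z ** 2) * p * (1 - p) / margin ** 2
    # Inflate the invitation count to compensate for expected non-response.
    return math.ceil(n_needed / response_rate)

# e.g. 95% confidence, +/-5% margin, worst-case p = 0.5, 60% expected response
print(required_invitations())
```

A lower expected response rate drives the required number of invitations up sharply, which is one reason response rate belongs in every survey report.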

Surveys can be excellent tools for elucidating subjective or otherwise non-observable data. With careful design and appropriate application, they can attain a high degree of scientific rigor.

 

Andrew Ketterer, MD

Medical Education Fellow

Beth Israel Deaconess Medical Center/Harvard Medical School

References:

Artino AR, La Rochelle JS, Dezee KJ, et al. Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach. 2014;36(6):463-74.

Phillips AW. Proper applications for surveys as a study methodology. West J Emerg Med. 2017;18(1):8-11.

 

CDEM Voice – FOAMonthly


A Heaping Spoonful of FOAM Literature Reviews


Site: https://journalfeed.org

Are you tired of hearing about “that cool new study” that your student heard about on a podcast, but can’t cite? Do you find the “why do we do this?” question difficult to answer on shift? Looking to direct your students to landmark EM papers that guide our practice?  This month’s FOAMonthly highlights JournalFeed, formerly EM Topics, which promises to spoon feed us the most important EM literature.

Clay Smith, MD (@spoonfedEM), created JournalFeed to meet the broad needs of novice to advanced learners who want help staying informed. Via a frequently updated website, daily emails, and weekly summary emails, Dr. Smith uses the "spaced repetition" model of learning. After combing through each month's new EM literature (>1500 articles from 31 journals), he selects the articles most relevant to an EM physician's practice and posts and emails one reviewed article per day. Each review links to the primary source and also summarizes the highlights, including a single-sentence "Spoon Feed" summary, a "why does this matter" section, and a brief discussion of the article's findings and methods. The website is open access, and the daily and weekly emails are available via free subscription.

For our most novice learners, the "Landmark Papers" section will be the most informative. From vent management, stroke care, and antibiotic stewardship to health policy and ED operations, there's something for every interest. Best of all, the summaries are short, concise, and relevant to daily practice.

 

Happy Spoonfeeding,

 

Emily Brumfield MD

@DrSadieHawkins

Assistant Professor

Assistant Director of Undergraduate Medical Education

Vanderbilt University Medical Center

CDEM Voice – Member Highlight




David Story MD, FACEP

B.S., Biochemistry, Louisiana State University & A&M College, 2002

M.D., Louisiana State University Health Sciences Center in New Orleans, 2006

Residency, Emergency Medicine, Duke University Hospital, 2009

Fellowship, Medical Toxicology, NYU/Bellevue/NYC-Poison Control Center, 2010

Physician, Emergency Department, Lower Manhattan Hospital (formerly New York Downtown Hospital), 2009-2011

Assistant Professor, Department of Emergency Medicine, Wake Forest University Medical Center, 2011-present


 

1. What is your most memorable moment of teaching?
There isn’t a specific moment, but I love seeing that point at which a certain idea “clicks” with a student and their eyes light up with the sudden understanding of something that was previously unknown.

2. Who or what is your biggest influence?
I have been blessed to have been surrounded by several outstanding clinician educators during my training and practice years. Physicians who not only have an extraordinary knowledge base, but are complete practitioners: excellent bedside manner, patient advocates, supporters of ancillary staff, and can disseminate their knowledge to others in a meaningful, respectful, and understandable fashion. A few that come to mind are Josh Broder, Randall Best, Bob Hoffman and David Manthey, but that is far from an exhaustive list!

3. Any advice for other clerkship directors?
Listen to your students. They are a great resource for what is working, or not working, on the rotation. And don't be afraid to change things within the clerkship experience. Just because "it's always been done this way…" doesn't mean it's the best way!

4. What is your favorite part about being an educator/director?
My favorite part about working in an academic setting is that we get to wear so many hats: educator, mentor, physician, administrator, advocate. We as EM physicians are often attracted to the specialty because of the breadth of patients and complaints that we see clinically, and academic EM provides the opportunity for variability/versatility adjacent to the clinical realm. In my case, being involved in education (student and/or resident) really inspires me to keep learning so that I can effectively guide and educate the future physicians of the specialty.

5. Any interesting factoids you would like to share?
My fourth year of medical school was interrupted when Hurricane Katrina devastated New Orleans in 2005. I spent the next 5 months doing away rotations, allowing me to spend significant time in some of America's great hospitals and cities. Side note: Katrina was the 3rd of 5 hurricanes to affect me that summer and fall of 2005. You've heard of "storm chasers"; well, it felt like I was the one being chased!

 

CDEM Voice – FOAMonthly


http://scienceforwork.com/blog/virtual-teams-trust/

In our academic lives, we constantly work in teams and increasingly, we are working in virtual teams. Would you like to apply the best evidence from the organizational and management literature to make your virtual teams effective? If so, read this post from the website Science for Work to learn how to build trust in your virtual teams—increased trust leads to improved team performance and everyone benefits!

When working in a virtual team, trust takes on even greater importance: there are fewer social cues, poorer understanding of tasks, and increased risk of conflict and role ambiguity. Thus, trust becomes more essential to effective team functioning. Here are four tips to improve trust on your virtual team. First, get to know your team. If possible, use social activities to engage each other and build relationships. Second, be trustworthy. Team members trust each other based on three factors: competence, integrity, and benevolence, so strive to display all three. Third, clearly assign roles and tasks. Documenting interactions through recorded videos or chats can help with this. Lastly, work to maintain trust. Once trust is established, it is important to maintain it through frequent team interactions. If you want to learn more about evidence-based management, Science for Work (http://scienceforwork.com/) has ample reading with clearly distilled take-away points!

Kendra Parekh, MD


CDEM Voice – Member Highlight


Laura Thompson

 

 



Laura Thompson, MD MS

Assistant Professor
Department of Emergency Medicine
OSU Wexner Medical Center



 

  1. What is your most memorable moment of teaching?

I love the moment when you can help a student figure out if a patient is “sick” or “not sick.” It is an incredibly important teaching point, and after a student has 3-4 years of classroom and clinical learning in med school, it is great to see it all come together.

  2. Who or what has been your greatest influence?

My dad believed in a life of service, and I see education and medicine as two fields that intersect with service to students, patients, and society. My mom always juggled work and family, and has been a role model as I’ve started my career.

  3. Any advice for other clerkship directors?

There is a balance to being a clerkship director – being a student advocate and holding your students to high standards to help them become great physicians. It was initially challenging to be the one to call students out when they weren’t performing well in one area. But I think it is perhaps those students we can help the most – if we can identify the gaps in knowledge or in skills, it becomes so much easier to train the next generation of physicians.

  4. What is your favorite part about being an educator/director?

I love clinical teaching and finding the one or two major points per shift that a student can walk away with as a new skill.

  5. Any interesting factoids you would like to share?

I tell my trainees that you never know where you will learn your leadership skills for running a code or an arrest. When I was a resident, I was in the CTICU on an overnight with a patient in extremis. After things were managed, one of the more seasoned nurses turned to me and said “Were you a coxswain or something??” Indeed – I was a coxswain for about 6 years, and those leadership skills have helped me manage many difficult situations. So, I encourage my mentees to work hard in whatever they pursue, and realize the arenas of work and play are not always so different.


CDEM Voice – FOAMonthly


Stimulating Active Learning: Audience Response Systems

Online Source: https://icenetblog.royalcollege.ca/2016/12/20/audience-response-systems-for-teaching-and-talks-why-and-how/

Compared to the traditional passive lecture, active learning methods can increase student participation and motivation, promote critical thinking skills and even increase knowledge retention. Think-pair-share, flipped classroom, gamification, and team based learning are all examples of methods to promote active learning. Unfortunately, consistently incorporating these into the clerkship didactics can be difficult, especially with a rotating set of faculty volunteer lecturers and variable student engagement. One way to promote active learning in a structured didactic format is through the use of audience response systems.

This post, from the ICE blog, provides a nice overview of audience response systems and highlights several audience response technologies. By forcing students to commit to an answer, these systems provide learners with real-time feedback about their knowledge gaps in a low-stakes environment. This can be especially helpful for engaging the quieter students. These systems can also be used to provide accountability for any pre-reading or facilitate team competitions. For the instructor, the class responses can guide the focus of the discussion. More advanced technologies can track a learner’s progress over time to assist with formative feedback. These audience response systems, however, are a tool and not an active learning method in and of themselves. They are not a substitute for well-written questions or effective teaching styles. Nevertheless, this technology can serve as an accessible means to promote active learning and a great resource for colleagues searching for ways to develop more interactive teaching sessions.

Laura Welsh, MD

Medical Education Fellow

Division of Emergency Medicine

University of Washington School of Medicine

CDEM Voice – Topic 360


 

“If you want to know how we practiced medicine 5 years ago, read a textbook.
If you want to know how we practiced medicine 2 years ago, read a journal.
If you want to know how we practice medicine now, go to a (good) conference.
If you want to know how we will practice medicine in the future, listen in the hallways and use FOAM.”

               – International EM Education Efforts & E-Learning by Joe Lex 2012

 

Since the Free Open Access Medical Education (FOAM) movement started in 2012, many emergency practitioners and educators have adopted this concept to disseminate information to the medical community. FOAM is an independent platform that includes, but is not restricted to, blogs, online videos, Twitter hashtags, web applications, and podcasts. The current trend in education has expanded beyond textbooks, lectures, and peer-reviewed articles. FOAM allows new and updated medical information to be distributed in a timely manner, anytime, anywhere, with the capability of interacting directly with the authors. FOAM is not just a concept; it has become an ideology.

Despite the growing use of FOAM, there are several professionalism issues that we as educators and researchers need to consider. For instance, who is to blame if a medical error results from using FOAM in patient care? How can you rate the quality of the information you are reviewing? To address these issues, Academic Life in Emergency Medicine introduced the Approved Instructional Resources (AIR) series, in which a nine-person executive board of clinicians created a five-question scoring rubric. Medical educators can use this tool to rate online resources, better evaluate the quality of the information, and help their learners utilize FOAM resources effectively.

Another issue that has not yet been addressed by the medical education community is how to maintain ownership when reviewing, sharing, or creating a FOAM idea. FOAM is defined as "open access," meaning "free availability to the public internet, permitting any user to read and distribute without financial, legal, or technical barrier." This is both a blessing and a curse. Although it provides users unrestricted access to educational materials, it does not guarantee authors control over the integrity of their work or the right to be appropriately acknowledged and cited. One might assume that since we are in a highly professional field, users will follow common ethics and professionalism when sharing and crediting FOAM content. However, there have been instances where an individual publicly shared an innovative idea that was then translated into a successful project by another individual with no acknowledgment of the original source.

A discussion on Life in the Fast Lane suggested composing a FOAM charter or code, whereby FOAM creators register and are given a special "stamp" indicating that they have adhered to the principles of ethical use and creation of FOAM. However, who would oversee such a process remains unclear.

As medical educators, we should discuss these issues with our learners. Until the medical education community comes forward with consensus on its use, we are relying on the current users of FOAM to challenge contributors, question the evidence, and maintain academic integrity.

 

Layla Salman Abubshait, MD

Medical Education Fellow

Department of Emergency Medicine

Beth Israel Deaconess Medical Center

 

Reference:

  1. Chan TM, Grock A, Paddock M, Kulasegaram K, Yarris LM, Lin M. Examining reliability and validity of an online score (ALiEM AIR) for rating free open access medical education resources. Ann Emerg Med. 2016;68(6):729-35.
  2. Nickson C. Time for a FOAM charter? Life in the Fast Lane. July 28, 2013. <https://lifeinthefastlane.com/time-for-a-foam-charter/>.