EMRA Hangout Invitation: 11/7/2017 (FemInEM)


We are excited to invite you to take part in EMRA’s webinar-style series, EMRA Hangouts. Each month, EMRA Hangouts hosts a featured emergency medicine program director and other key speakers as they share their insights on everything from the match process to away rotations, interviews, and more!

This Hangout is a special edition in collaboration with FemInEM and features a discussion with female physicians and residents focused on women’s paths through medical education and the specific challenges faced by female students while applying to and working in emergency medicine. This session is called:

WOMEN IN EMERGENCY MEDICINE PANEL WITH FemInEM

Featuring:

Dara Kass, MD
CEO and Editor-in-Chief of FemInEM
Deborah Pierce, DO
Residency Program Director, Einstein Healthcare Network
Shana Ross, DO
Assistant Program Director, Assistant Professor of Emergency Medicine
Megan Stobart-Gallagher, DO
Assistant Program Director, Undergraduate Medical Director, Einstein Healthcare Network

Topic: “Women in Emergency Medicine Panel w/ FemInEM”

Date: November 7, 2017

Time: 8 PM Eastern | 7 PM Central | 6 PM Mountain | 5 PM Pacific

SIGN UP FOR WOMEN IN EMERGENCY MEDICINE PANEL

For more information on past and future sessions make sure to visit the EMRA Hangouts website at: https://www.emra.org/students/hangouts/

 

CDEM Voice – Research Column


Using Surveys in Medical Education Research

Surveys are a common means of acquiring data in medical education research. They can be used to perform needs assessments for curriculum design, evaluate learner and instructor perceptions, or gather information about the attitudes and preferences of individuals in the educational environment.

Surveys are best used to obtain subjective data, such as opinions or attitudes, which cannot be observed directly. Directly observable data may be gained by various means, such as direct measurement (e.g., using an exam to test content retention), database review (e.g., analyzing student academic records), or third-party observation (e.g., outsourcing analysis of videotaped procedural performance). In contrast, surveys ask respondents to report or recall information, so their findings are subject to cognitive biases such as recall bias. For this reason, a survey should be used to obtain data only when more direct assessment is impractical or impossible.

When designed and applied properly, surveys can achieve a scientific rigor comparable to that of psychometric tools. If relevant to the researcher’s questions, previously validated survey tools should be used, as these are more likely to yield high-quality responses and allow for comparison against pre-existing data. Many questions in medical education research, however, require the construction of new surveys, because pre-existing survey items may not fully capture the information relevant to the study at hand.

Artino et al. describe seven steps in developing a survey tool to maximize its scientific rigor:

1) Conduct a literature review to look for pre-existing survey tools or items and to align the new survey with the existing literature.

2) Conduct interviews or focus groups to understand how the population of interest conceptualizes the relevant questions.

3) Synthesize the first two steps to ensure that the ideas under investigation are expressed in language the population of interest can understand and make theoretical sense in light of the literature.

4) Develop the survey items.

5) Conduct expert review of these items to assess content validity.

6) Vet the items with members of the respondent population to ensure they are understood in the manner the investigators intend.

7) Run a pilot survey to evaluate response variance and formally analyze the content validity of each survey item.

Performing every one of these steps may be too cumbersome for a given survey, but incorporating as many as possible will increase the quality and reproducibility of survey responses and make the data more reliable for comparison against future studies.
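Artino et al. do not prescribe a particular statistic for the content validity analysis in steps 5 and 7, but one common approach is the item-level content validity index (I-CVI): each expert rates an item’s relevance on a 4-point scale, and the index is the proportion of experts rating it 3 or 4. The sketch below is a minimal illustration of that idea; the item names, ratings, and the 0.78 cutoff are hypothetical or conventional, not taken from this column.

```python
from typing import Dict, List


def item_cvi(ratings: List[int]) -> float:
    """Item-level content validity index (I-CVI): proportion of experts
    rating an item 3 or 4 on a 4-point relevance scale."""
    if not ratings:
        raise ValueError("No expert ratings provided")
    return sum(1 for r in ratings if r >= 3) / len(ratings)


# Hypothetical expert relevance ratings (1 = not relevant ... 4 = highly relevant)
expert_ratings: Dict[str, List[int]] = {
    "Q1_interest_in_EM_career": [4, 4, 3, 4, 3],
    "Q2_confidence_with_procedures": [2, 3, 2, 4, 3],
}

for item, ratings in expert_ratings.items():
    cvi = item_cvi(ratings)
    # A commonly cited (though not universal) rule of thumb flags items with I-CVI < 0.78.
    flag = "consider revising" if cvi < 0.78 else "acceptable"
    print(f"{item}: I-CVI = {cvi:.2f} ({flag})")
```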

Furthermore, careful design facilitates the reporting of important methodological details. A robust evaluation of the study population enables a clearer description of the sampling criteria, which adds to a survey’s reproducibility and indicates the generalizability of its results. Content review and item vetting not only improve the quality of data collected by individual questions but also help identify the best means of measuring responses. For example, an opinion question might use a Likert scale or free responses, and these two types of data demand very different analyses. Identifying which response type is best suited to gathering the data of interest is crucial for both data acquisition and reporting.
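To make that contrast concrete, here is a minimal sketch (not from the column, with invented data) of how the two response types are typically handled: Likert responses are ordinal, so they are summarized with frequency counts and a median rather than a mean, while free-text responses must be qualitatively coded before any quantitative summary is possible.

```python
from collections import Counter
from statistics import median

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
likert_responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

# Ordinal data: report the distribution and median rather than treating
# the scale points as interval measurements.
print("Frequencies:", dict(sorted(Counter(likert_responses).items())))
print("Median response:", median(likert_responses))

# Free-text responses, by contrast, require qualitative coding before any
# quantitative summary is possible.
free_text_responses = [
    "The simulation sessions helped me feel prepared for shifts.",
    "More feedback on documentation would be useful.",
]
print("Free-text items awaiting coding:", len(free_text_responses))
```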

Finally, carefully crafted survey items avoid ambiguous or confusing language. This reduces erroneous responses and thus makes inaccurate conclusions less likely. Clear wording can also improve the response rate, which is a crucial piece of information for any survey study to report. As in all study types, adequate statistical power reduces the risk of a type II error, and in survey methodology power depends on the response rate as well as the sample size.
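As a back-of-the-envelope illustration of how response rate and sample size interact (the helper function and numbers below are hypothetical, not from the column), this sketch converts a target number of completed surveys from a power calculation into the number of invitations that would need to be sent, assuming a constant expected response rate.

```python
import math


def invitations_needed(target_responses: int, expected_response_rate: float) -> int:
    """Number of surveys to distribute to expect a given number of completed
    responses, assuming a constant response rate."""
    if not 0 < expected_response_rate <= 1:
        raise ValueError("Expected response rate must be between 0 and 1")
    return math.ceil(target_responses / expected_response_rate)


# Illustrative numbers only: if a power calculation calls for 100 completed
# surveys and prior studies suggest a 40% response rate, roughly 250
# invitations are needed.
print(invitations_needed(target_responses=100, expected_response_rate=0.40))  # 250
```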

Surveys can be excellent tools for elucidating subjective or otherwise non-observable data. With careful design and appropriate application, they can attain a high degree of scientific rigor.

 

Andrew Ketterer, MD

Medical Education Fellow

Beth Israel Deaconess Medical Center/Harvard Medical School

References:

Artino AR, La Rochelle JS, Dezee KJ, et al. Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach. 2014;36(6):463-474.

Phillips AW. Proper applications for surveys as a study methodology. West J Emerg Med. 2017;18(1).

 

CDEM Voice – FOAMonthly


A Heaping Spoonful of FOAM Literature Reviews


Site: https://journalfeed.org

Are you tired of hearing about “that cool new study” that your student heard about on a podcast, but can’t cite? Do you find the “why do we do this?” question difficult to answer on shift? Looking to direct your students to landmark EM papers that guide our practice?  This month’s FOAMonthly highlights JournalFeed, formerly EM Topics, which promises to spoon feed us the most important EM literature.

Clay Smith, MD (@spoonfedEM), created JournalFeed to meet the broad needs of novice to advanced learners who need help staying informed. Via a nearly continuously updated website, daily emails, and weekly summary emails, Dr. Smith uses a “spaced repetition” model of learning. Each month he combs through the new EM literature (>1,500 articles from 31 journals), selects the articles most relevant to an EM physician’s practice, and posts and emails one reviewed article per day. Each review links to the primary source and also summarizes the highlights, including a single-sentence “Spoon Feed” takeaway, a “why does this matter” section, and a brief discussion of the article’s findings and methods. The website is open access, and the daily and weekly emails are available by free subscription.

For our most novice learners, the “Landmark Papers” section will be the most informative. From articles on ventilator management, stroke care, and antibiotic stewardship to health policy and ED operations, there is something for every interest. Best of all, the summaries are concise and relevant to daily practice.

 

Happy Spoonfeeding,

 

Emily Brumfield, MD

@DrSadieHawkins

Assistant Professor

Assistant Director of Undergraduate Medical Education

Vanderbilt University Medical Center

CDEM Faculty Podcast Episode 6

Episode 6:  National Clinical Assessment Tool for Emergency Medicine (NCAT-EM)

 

Hi everyone! In this month’s podcast I had the opportunity to talk with Dr. Kathy Hiller about the recently released National Clinical Assessment Tool for Emergency Medicine (NCAT-EM), which was developed by Jules Jung, Doug Franzen, Kathy Hiller, and Luan Lawson. We last talked with this group in episode 2 (https://cdemcurriculum.com/2016/03/26/end-of-shift-assessment-of-medical-students/), when they were still in the early stages of developing this evaluation tool. In this interview, Dr. Hiller discusses how you can use the NCAT-EM (see link below) in your clerkship and describes the group’s current study. If you are interested in using the NCAT-EM and/or would like to be a part of the study, please email Dr. Hiller at khiller@aemrc.arizona.edu.

NCAT-EM

Keep up to date with the latest CDEM Faculty podcasts on SoundCloud, RSS, and iTunes.

Suzana Tsao, DO

Director, Medical Student Clerkship

Perelman School of Medicine

CDEM Voice – Member Highlight




David Story, MD, FACEP

B.S., Biochemistry, Louisiana State University & A&M College, 2002

M.D., Louisiana State University Health Sciences Center in New Orleans, 2006

Residency, Emergency Medicine, Duke University Hospital, 2009

Fellowship, Medical Toxicology, NYU/Bellevue/NYC-Poison Control Center, 2010

Physician, Emergency Department, Lower Manhattan Hospital (formerly New York Downtown Hospital), 2009-2011

Assistant Professor, Department of Emergency Medicine, Wake Forest University Medical Center, 2011-present


 

1. What is your most memorable moment of teaching?
There isn’t a specific moment, but I love seeing that point at which a certain idea “clicks” with a student and their eyes light up with the sudden understanding of something that was previously unknown.

2. Who or what is your biggest influence?
I have been blessed to be surrounded by several outstanding clinician educators during my training and practice years: physicians who not only have an extraordinary knowledge base but are complete practitioners, with excellent bedside manner, who advocate for patients, support ancillary staff, and disseminate their knowledge to others in a meaningful, respectful, and understandable fashion. A few who come to mind are Josh Broder, Randall Best, Bob Hoffman, and David Manthey, but that is far from an exhaustive list!

3. Any advice for other clerkship directors?
Listen to your students. They are a great resource for what is working, or not working, on the rotation. And don’t be afraid to change things within the clerkship experience. Just because “it’s always been done this way…” doesn’t mean it’s the best way!

4. What is your favorite part about being an educator/director?
My favorite part about working in an academic setting is that we get to wear so many hats: educator, mentor, physician, administrator, advocate. We as EM physicians are often attracted to the specialty because of the breadth of patients and complaints that we see clinically, and academic EM provides the opportunity for variability/versatility adjacent to the clinical realm. In my case, being involved in education (student and/or resident) really inspires me to keep learning so that I can effectively guide and educate the future physicians of the specialty.

5. Any interesting factoids you would like to share?
My fourth year of medical school was interrupted when Hurricane Katrina devastated New Orleans in 2005. I spent the next 5 months doing away rotations, allowing me to spend significant time in some of America’s great hospitals and cities. Side note: Katrina was the 3rd of 5 hurricanes to affect me that summer and fall of 2005. You’ve heard of “storm chasers”; well, it felt like I was the one being chased!

 

CDEM Voice – FOAMonthly


http://scienceforwork.com/blog/virtual-teams-trust/

In our academic lives we constantly work in teams, and increasingly those teams are virtual. Would you like to apply the best evidence from the organizational and management literature to make your virtual teams effective? If so, read this post from the website Science for Work to learn how to build trust in your virtual teams: increased trust leads to improved team performance, and everyone benefits!

When working in a virtual team, trust takes on even greater importance: there are fewer social cues, poorer understanding of tasks, and increased risk of conflict and role ambiguity. Here are four tips to improve trust on your virtual team. First, get to know your team; if possible, use social activities to engage each other and build relationships. Second, be trustworthy. Team members trust each other based on three factors: competence, integrity, and benevolence, so strive to display all three. Third, clearly assign roles and tasks; documenting interactions through recorded videos or chats can help with this. Lastly, work to maintain trust. Once trust is established, it is important to maintain it through frequent team interactions. If you want to learn more about evidence-based management, Science for Work (http://scienceforwork.com/) has ample reading with clearly distilled takeaway points!

Kendra Parekh, MD


SAEM18 Call For Didactics

The Program Committee of the Society for Academic Emergency Medicine invites proposals for didactic sessions for the 2018 SAEM Annual Meeting.

Didactics will be selected to provide a robust educational experience during SAEM18. All proposals should support the mission of SAEM: “To lead the advancement of emergency care through education and research, advocacy, and professional development in academic emergency medicine.” Didactics may have a broad or focused audience.  

SAEM18 will place a premium on innovative and interactive didactic sessions. Accepted didactics will be roughly split between two didactic formats:

1) Focused session (20 minutes, generally 1 or 2 speakers)

• Appropriate for most didactic sessions

• Format requires a precise, well-honed presentation

2) Expanded session (50 minutes, generally multiple speakers)

Successful submissions for this format will require significant interactivity and breadth of content, such as:

• Panel discussion

• Lecture or seminar style

• Interactive workshop with small group facilitators

Submitters should detail their reasons for requesting the expanded format during the submission.

Successful didactic proposals will represent the state of the art in their content areas.

1) Clinical topics should focus on cutting-edge research and its applications to patient care or future research directions.

2) Administrative proposals may focus on topics such as approaches to systems, quality improvement, staffing, and planning.

3) Medical education sessions can span from teaching skills development to educational innovations and curricular design.

4) Research session proposals can focus on research methodology and tools, as well as topics of interest to both the research and general EM communities.

We also encourage submitters to think creatively about content that they feel would have significant appeal to the SAEM membership, even if it is not represented in one of the categories above.

Please contact didactics@saem.org if you have any questions.