
Use this identifier to cite or link to this item: http://tede2.unifenas.br:8080/jspui/handle/jspui/198
Full metadata record
DC Field: Value [Language]
dc.creator: Matos, Flávia Soares de
dc.creator.Lattes: http://lattes.cnpq.br/2039015899640578 [por]
dc.contributor.advisor1: Toledo Junior, Antônio Carlos de Castro
dc.contributor.advisor1Lattes: http://lattes.cnpq.br/2192040830710780 [por]
dc.contributor.referee1: Santos, Rodrigo Ribeiro dos
dc.contributor.referee1Lattes: http://lattes.cnpq.br/2565457057115441 [por]
dc.contributor.referee2: Santos, Silvana Maria Eloi
dc.contributor.referee2Lattes: http://lattes.cnpq.br/3155049426520303 [por]
dc.date.accessioned: 2018-07-31T16:58:49Z
dc.date.issued: 2018-02-26
dc.identifier.citation: MATOS, Flávia Soares de. A prova prática-oral estruturada é comparável a uma estação do exame clínico objetivo estruturado, na avaliação de habilidades clínicas em estudantes de medicina? Estudo experimental, 2017. 2018. 45 f. Dissertação (Mestrado em Ensino em Saúde) - Universidade José do Rosário Vellano, Belo Horizonte, 2018. [por]
dc.identifier.uri: http://tede2.unifenas.br:8080/jspui/handle/jspui/198
dc.description.resumo: Apesar de o exame clínico objetivo estruturado (OSCE) ser considerado padrão ouro para a avaliação de habilidades clínicas, ele é uma prova de organização complexa e de alto custo. Por outro lado, as provas do tipo prática-oral estruturada (POE) são de aplicação mais simples e de menor custo, ainda que possam apresentar menor validade e confiabilidade. Objetivo: comparar o desempenho acadêmico e a percepção de alunos de Medicina na avaliação de habilidades por OSCE e POE. Método: foram elaboradas duas provas (OSCE e POE) para avaliação de cinco habilidades obstétricas em manequim (três primeiras manobras de Leopold, medida de útero-fita e ausculta de batimentos cardíacos fetais). A POE avaliou as habilidades isoladamente e o OSCE avaliou as habilidades após a análise de um caso clínico contextualizado. Estudantes do 4º período do curso de Medicina foram distribuídos em dois grupos. Na primeira fase, o Grupo 1 realizou a POE e o Grupo 2, o OSCE. Na segunda fase, três semanas depois, aplicaram-se novamente as mesmas provas, de modo invertido. As provas foram aplicadas por um único avaliador, que utilizou o mesmo checklist nas duas provas. Na segunda fase, aplicou-se também um questionário sobre a percepção dos alunos em relação aos dois tipos de prova. Comparou-se a nota média em cada questão e a nota total nos seguintes cruzamentos: tipo de prova em cada uma das fases; tipo de prova independentemente da fase; OSCE e POE intragrupo; notas da primeira e da segunda fases, independentemente do tipo de prova; e notas entre os grupos, independentemente da fase. A percepção do aluno foi analisada por distribuição de frequência e agrupamento das respostas abertas por semelhança. Resultados: 21 alunos participaram do estudo, sendo 13 do Grupo 1 e oito do Grupo 2. Não houve diferença entre as notas das questões e a nota total entre os dois tipos de prova, nas duas fases do estudo. Também não se observaram diferenças nos demais cruzamentos, exceto quando se compararam as notas entre as fases do estudo. Na comparação intragrupo, a nota total e a nota da questão 2 (útero-fita) foram superiores na segunda fase nos dois grupos. A comparação entre as médias das notas da primeira e da segunda fases, independentemente do tipo de prova, demonstrou que as notas da segunda fase foram superiores na nota total e em todas as questões, exceto na 2ª e na 3ª manobras de Leopold. A melhora das notas no segundo dia pode estar relacionada ao efeito teste. Conclusão: o tipo de prova não influenciou o desempenho do aluno. A maioria dos alunos preferiu a prova tipo OSCE. [por]
dc.description.abstract: Although the objective structured clinical examination (OSCE) is considered the gold standard for the assessment of clinical skills, it is both expensive and complex to organize. Structured oral examinations (SOE), on the other hand, are simpler and cheaper to administer, although they may have lower validity and reliability. Objective: to compare medical students' academic performance and perception when skills are assessed by OSCE and SOE. Method: two tests (OSCE and SOE) were designed to assess five obstetric skills on a mannequin (the first three Leopold maneuvers, fundal height measurement, and fetal heart rate auscultation). The SOE assessed the skills in isolation, while the OSCE assessed them after the analysis of a contextualized clinical case. Fourth-term medical students were distributed into two groups. In the first phase, Group 1 took the SOE and Group 2 the OSCE. Three weeks later, in the second phase, the same tests were applied again, with the groups switched. Both tests were applied by a single examiner, who used the same checklist in both. In the second phase, a questionnaire about the students' perception of the two test types was also applied. The mean score per question and the total score were compared across the following crossings: test type within each phase; test type regardless of phase; OSCE versus SOE within each group; first versus second phase regardless of test type; and group versus group regardless of phase. Students' perception was analyzed by frequency distribution and by grouping open-ended answers by similarity. Results: 21 students took part in the study, 13 in Group 1 and 8 in Group 2. There was no difference in question scores or total score between the two test types in either phase of the study. No differences were observed in the other crossings either, except when scores were compared between study phases. In the within-group comparison, the total score and the score on question 2 (fundal height measurement) were higher in the second phase in both groups. The comparison of mean scores between the first and second phases, regardless of test type, showed that second-phase scores were higher for the total score and for all questions except the second and third Leopold maneuvers. The improvement on the second day may be related to the testing effect. Conclusion: the test type did not influence students' performance. Most students preferred the OSCE. [eng]
dc.description.provenance: Submitted by Kely Alves (kely.alves@unifenas.br) on 2018-07-31T16:56:35Z. No. of bitstreams: 1. Dissertação Flávia Matos.pdf: 868333 bytes, checksum: 75207be4679ddfd22055131d0846333d (MD5) [eng]
dc.description.provenance: Approved for entry into archive by Kely Alves (kely.alves@unifenas.br) on 2018-07-31T16:57:10Z (GMT). No. of bitstreams: 1. Dissertação Flávia Matos.pdf: 868333 bytes, checksum: 75207be4679ddfd22055131d0846333d (MD5) [eng]
dc.description.provenance: Approved for entry into archive by Kely Alves (kely.alves@unifenas.br) on 2018-07-31T16:58:19Z (GMT). No. of bitstreams: 1. Dissertação Flávia Matos.pdf: 868333 bytes, checksum: 75207be4679ddfd22055131d0846333d (MD5) [eng]
dc.description.provenance: Made available in DSpace on 2018-07-31T16:58:49Z (GMT). No. of bitstreams: 1. Dissertação Flávia Matos.pdf: 868333 bytes, checksum: 75207be4679ddfd22055131d0846333d (MD5). Previous issue date: 2018-02-26 [eng]
dc.format: application/pdf
dc.language: por
dc.publisher: Universidade José do Rosário Vellano [por]
dc.publisher.department: Pós-Graduação [por]
dc.publisher.country: Brasil [por]
dc.publisher.initials: UNIFENAS [por]
dc.publisher.program: Programa de Pós-Graduação em Saúde [por]
dc.rights: Acesso Aberto [por]
dc.subject: Educação Médica. Avaliação Educacional. Competência Clínica. [por]
dc.subject: Medical Education. Educational Assessment. Clinical Competence. [eng]
dc.subject.cnpq: CIENCIAS DA SAUDE::MEDICINA [por]
dc.title: A prova prática-oral estruturada é comparável a uma estação do exame clínico objetivo estruturado, na avaliação de habilidades clínicas em estudantes de medicina? Estudo experimental, 2017. [por]
dc.type: Dissertação [por]
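
The dc.description.abstract above describes paired comparisons of mean question scores and total scores across several crossings (test type within and across phases, and phase 1 versus phase 2 regardless of test type). Purely as an illustration of that kind of paired comparison, the sketch below uses made-up score data; the variable names, score values, and the choice of a Wilcoxon signed-rank test are assumptions for the example and are not taken from the dissertation's actual analysis.

# Illustrative sketch only (hypothetical data, not the dissertation's analysis):
# paired comparison of OSCE vs. SOE total scores, and phase 1 vs. phase 2 scores,
# for a two-phase crossover design like the one described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students = 21  # sample size reported in the abstract

# Hypothetical total scores (0-10 scale) for each student on each test type.
osce_scores = rng.normal(7.5, 1.0, n_students).clip(0, 10)
soe_scores = rng.normal(7.4, 1.0, n_students).clip(0, 10)

# Every student took both test types, so a paired (within-subject) test applies.
_, p_type = stats.wilcoxon(osce_scores, soe_scores)
print(f"OSCE vs SOE (paired Wilcoxon): p = {p_type:.3f}")

# Hypothetical scores by phase, regardless of test type (testing-effect check).
phase1_scores = rng.normal(7.0, 1.0, n_students).clip(0, 10)
phase2_scores = rng.normal(7.8, 1.0, n_students).clip(0, 10)
_, p_phase = stats.wilcoxon(phase1_scores, phase2_scores)
print(f"Phase 1 vs phase 2 (paired Wilcoxon): p = {p_phase:.3f}")

With only 21 paired observations, a non-parametric paired test is a common conservative choice, which is why the sketch uses the Wilcoxon signed-rank test rather than a paired t-test.
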
Appears in collections: Programa de Mestrado em Ensino em Saúde

Files in this item:
File: Dissertação Flávia Matos.pdf
Description: Main document
Size: 847.98 kB
Format: Adobe PDF


Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.