Study Compares ChatGPT’s Performance to Students in Accounting Exams

A study led by researchers at Brigham Young University (BYU), with co-authors from 186 institutions worldwide, found that students outperformed OpenAI’s ChatGPT on accounting exams, scoring an average of 76.7% against the AI’s 47.4%. Even so, the researchers described ChatGPT’s performance as “impressive” and a “game changer” for the future of education. The study was published in the journal Issues in Accounting Education.

ChatGPT’s Strengths and Weaknesses in Accounting

ChatGPT surpassed the student average on 11.3% of the questions, performing especially well in accounting information systems (AIS) and auditing. However, it underperformed on tax, financial accounting, and managerial accounting assessments, a gap that may reflect the AI’s difficulty with the mathematical processes those subjects require.

Question Types and ChatGPT’s Performance

ChatGPT performed better on true/false questions (68.7% correct) and multiple-choice questions (59.5%) but struggled with short-answer questions (scoring between 28.7% and 39.1%). Researchers observed that higher-order questions were more challenging for the AI to answer. In some instances, ChatGPT provided convincing yet incorrect written descriptions or answered the same question inconsistently.

Concerns Regarding ChatGPT’s Errors and Fabrications

Researchers highlighted that ChatGPT occasionally made up facts and offered confident explanations even for incorrect answers. For example, the AI generated authentic-looking references that turned out to be entirely fabricated. It also made illogical mathematical errors, such as adding two numbers in a subtraction problem or dividing numbers incorrectly.

Collaborative Study Investigating ChatGPT’s Educational Impact

In light of the ongoing debate about the role of AI models like ChatGPT in education, lead author David Wood, a BYU professor of accounting, sought to involve as many professors as possible in the research. The study ultimately drew 327 co-authors from 186 educational institutions across 14 countries, who contributed 25,181 classroom accounting exam questions; undergraduate BYU students supplied a further 2,268 textbook test bank questions for ChatGPT to answer. The questions covered AIS, auditing, financial accounting, managerial accounting, and tax, and spanned varying difficulty levels and question types (true/false, multiple choice, short answer).
