STUDY ABSTRACT – Veterinary Teleradiology Reporting Times

Veterinary Teleradiology Reporting Times: A Retrospective Analysis and Future Directions.

STUDY AUTHORS

Seth Wallack, Andrew Fox, Eric Goldman, Aziz Beguliev, Sommer Aweidah

STUDY PRESENTATION

This study abstract was presented as a poster at the Symposium on Artificial Intelligence in Veterinary Medicine
(SAVY 2.0) in Ithaca, NY | May 16-18, 2025

Details from the poster are captured in this article.

Background

In human radiology, several metrics are used to assess report usefulness, including diagnostic accuracy and Average Time Reporting (ATR). Studies from 2013 and 2017 found reporting times averaging:

  • 98 seconds per thorax image,
  • 111 seconds per abdomen image, and
  • an ATR of 85 seconds per report, respectively.1,2,3,4

No published study of veterinary ATR for radiographs, with or without AI assistance, could be found.5,6

Objectives

  • The first objective is to establish a baseline ATR for single-region veterinary radiograph reporting WITHOUT AI assistance.
  • The second objective is to establish and compare a baseline ATR for single-region veterinary radiograph reporting WITH AI assistance.

Methods

Case Time Period: January 2023 to April 2025 (28 months).

Case Types: Single-region reports were grouped into thorax vs. abdomen and further divided into non-STAT vs. STAT. No distinction was made between typed vs. dictated reports or between canine and feline patients.

Radiologists: Two US board-certified veterinary radiologists (Drs. Seth Wallack, DACVR, and Andrew Fox, DACVR) with between 10 and 20 years of experience.

Exclusion Criteria: For reporting both with and without AI assistance, cases with ATRs under 20 seconds or over 20 minutes were excluded.

Platform: Both radiologists used the patented Vetology® reporting platform, which features personalized generative AI conclusions and recommendations derived from the radiologists’ findings.

AI Generation Time: Separately, the average AI generation time was calculated for thoracic and abdominal conclusions and recommendations. Two findings examples (~200 words and ~300 words) were tested for both thoracic and abdominal AI generation to assess the impact of the findings’ word count on generation time. ATR includes all aspects of report creation, from claiming to finalizing each single-region case.
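To make the calculation concrete, here is a minimal sketch of how an ATR could be computed from claim and finalize timestamps under the exclusion criteria above. The field names, timestamps, and data layout are illustrative assumptions and are not taken from the Vetology platform.

```python
# Minimal sketch of the ATR calculation described above.
# Field names and timestamps are hypothetical, not from the Vetology platform.
from datetime import datetime

MIN_SECONDS = 20          # exclude reports finished in under 20 seconds
MAX_SECONDS = 20 * 60     # exclude reports that took longer than 20 minutes

def average_time_reporting(cases):
    """cases: iterable of (claimed_at, finalized_at) datetime pairs."""
    durations = [(done - claimed).total_seconds() for claimed, done in cases]
    kept = [d for d in durations if MIN_SECONDS <= d <= MAX_SECONDS]
    excluded = len(durations) - len(kept)
    atr = sum(kept) / len(kept) if kept else float("nan")
    return atr, len(kept), excluded

# Two hypothetical single-region cases
cases = [
    (datetime(2024, 3, 1, 9, 0, 0), datetime(2024, 3, 1, 9, 6, 30)),
    (datetime(2024, 3, 1, 9, 10, 0), datetime(2024, 3, 1, 9, 14, 45)),
]
atr, used, excluded = average_time_reporting(cases)
print(f"ATR: {atr:.0f} seconds across {used} cases ({excluded} excluded)")
```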

Results

WITHOUT AI:

    • 1696 total cases.
    • 1367 used for ATR.
    • 329 cases excluded (19%) for >20 minutes.

WITH AI Assistance:

    • 2489 cases.
    • 2280 used for ATR.
    • 209 cases excluded (8%) for >20 minutes.

Veterinary Radiologist Average Time Reporting

Chart: ATR with vs. without AI assistance (reporting was 24% faster when AI was used).
Chart: number of single-region cases completed per hour.
Table: time to complete a 40-case day with and without AI.

Comparison of Time to Generate Conclusions + Recommendations Using Personalized AI Model

Table: time to generate conclusions and recommendations by findings word count.

Conclusions

These results reflect a specific veterinary teleradiology company’s baseline ATR with and without AI prelim reporting, recorded at the given points in time. The performance of the platform itself will continue to improve as AI evolves and processing speeds accelerate.

Using the approach described above, this study provides an initial benchmark for veterinary radiology reporting times and suggests that hybrid AI tools may help improve efficiency.

This baseline ATR may be comparable to industry-wide ATR for US and European board-certified veterinary radiologists.

This study demonstrates that veterinary radiograph ATR can be reduced by 24% (range 11-33%) using a personalized hybrid generative and visual AI reporting solution. Vetology AI-assisted reporting was shown to increase the number of reports per hour from 6.5 to 8.0.

A hybrid model of radiologist-reported findings and Vetology AI-generated, personalized conclusions and recommendations could reduce a 40-case workday by up to 3 hours.

AI-assisted reporting has the potential to reduce radiologist workload fatigue. Generating personalized AI conclusions and recommendations takes seconds.

Funding Disclosures: Vetology Innovations provided funding for this study through salaries.

References
  1. Cowan IA, MacDonald SL, Floyd RA. Measuring and managing radiologist workload: measuring radiologist reporting times using data from a Radiology Information System. J Med Imaging Radiat Oncol. 2013 Oct;57(5):558-66. doi: 10.1111/1754-9485.12092. Epub 2013 Jul 12. PMID: 24119269.
  2. Rathnayake S, Nautsch F, Goodman TR, Forman HP, Gunabushanam G. Effect of Radiology Study Flow on Report Turnaround Time. AJR Am J Roentgenol. 2017 Dec;209(6):1308-1311. doi: 10.2214/AJR.17.18282. Epub 2017 Oct 5. PMID: 28981363.
  3. Boland GW, Halpern EF, Gazelle GS. Radiologist report turnaround time: impact of pay-for-performance measures. AJR Am J Roentgenol. 2010 Sep;195(3):707-11. doi: 10.2214/AJR.09.4164. PMID: 20729450.
  4. Mityul MI, Gilcrease-Garcia B, Mangano MD, Demertzis JL, Gunn AJ. Radiology Reporting: Current Practices and an Introduction to Patient-Centered Opportunities for Improvement. AJR Am J Roentgenol. 2018 Feb;210(2):376-385. doi: 10.2214/AJR.17.18721. Epub 2017 Nov 15. PMID: 29140114.
  5. Weissman A, Solano M, Taeymans O, Holmes SP, Jiménez D, Barton B. A survey of radiologists and referring veterinarians regarding imaging reports. Vet Radiol Ultrasound. 2016 Mar-Apr;57(2):124-9. doi: 10.1111/vru.12310. Epub 2015 Dec 17. PMID: 26677167.
  6. Adams WM. A survey of radiology reporting practices in veterinary teaching hospitals. Vet Radiol Ultrasound. 1998 Sep-Oct;39(5):482-6. doi: 10.1111/j.1740-8261.1998.tb01638.x. PMID: 9771603.

Want to see our Personalized Prelim Tool in action?

To tour the platform and learn more, contact our team or book a demo for a firsthand look at our radiology support tool.

Vetology’s Approach to AI: Radiologist Consensus in Action

Understanding Variability in Veterinary Radiology Interpretations

Veterinary radiology is a common diagnostic tool used to evaluate conditions ranging from orthopedic injuries to internal diseases. However, the reality is that radiologists don’t always agree on an image’s interpretation. Unlike laboratory tests with definitive results, radiology reports are clinical opinions, influenced by individual expertise, experience, and subtle differences in image quality. This variability in diagnosis can lead to differences in treatment recommendations and patient outcomes.

In this article, we look at the factors that influence these discrepancies and how advancements in artificial intelligence (AI)-assisted veterinary radiology can help improve consistency in diagnostic imaging reporting.

Reasons for Variability in Veterinary Radiology

Radiology combines art and science, and differences in image interpretation are common, even among board-certified radiologists. Radiology reports rely on expert opinion, which can vary based on several factors:

  • Subjectivity and cognitive bias — Radiologists rely on pattern recognition to identify abnormalities, and subtle differences in perception and cognitive bias can lead to different conclusions. For example, confirmation bias may make the radiologist see what aligns with their expectations, and anchoring bias can make them stick to an initial assessment.
  • Experience — A radiologist’s experience can influence their interpretative skills and diagnostic approach, and their background can shape how they assess an image. For instance, specialists in orthopedic imaging may emphasize bone structure, while those with soft tissue expertise may focus more on organ abnormalities.
  • Image quality — Underexposed or overexposed images can obscure fine details, and poor patient positioning may affect visibility, leading to misinterpretation.
  • Clinical context — Veterinary radiologists are trained to interpret images without first looking at the patient’s history to keep their assessment objective. That said, a strong clinical history that includes exam findings and relevant background helps shape a more complete and accurate report. The more context a radiologist has, the better they can tailor conclusions and recommendations. In some cases, the same images might lead to different interpretations depending on the clinical details provided.
  • Complex cases — Some conditions, such as early-stage tumors, inconspicuous fractures, or certain lung diseases, can present with subtle or overlapping features, making classification difficult. Differences in how radiologists weigh the significance of these findings can lead to varying interpretations.
  • The human factor — Radiologists are humans, and issues such as fatigue and time constraints can impact diagnostic accuracy. Evaluating hundreds of images per day can also impact a radiologist’s mental focus, and a heavy workload may lead to less thorough evaluations.

Radiologist Consensus and Variability

When developing our veterinary AI radiology tool, the Vetology team set out to understand where radiologists consistently agreed—and where their interpretations differed. Identifying conditions with high agreement rates between different radiologists guides our selection criteria for building new AI classifiers. Studying patterns of diagnostic variability helps train the models to better handle ambiguous cases.

This process isn’t static. Our models continue to evolve through regular retraining, and input from real-world clinical use. Feedback from veterinarians and our internal human case reviews play a key role in flagging areas where the AI might need more structure or refinement. It’s all part of our goal to ensure the AI aligns with expert-level thinking and delivers meaningful support.

To support our understanding of diagnostic consistency, the team asked veterinary radiologists—without any involvement from AI—to independently evaluate and diagnose images with a wide variety of canine conditions. The radiologists showed high levels of agreement with one another on conditions such as pregnancy, urinary stones, hepatomegaly, small intestinal obstruction, cardiomegaly, pericardial effusion, and esophageal enlargement. In other words, these diagnoses were more consistently interpreted across different experts.

In contrast, there was noticeably lower agreement among radiologists on conditions like pyloric gastric obstruction, right kidney size, subtle or suspicious nodules, and bronchiectasis, indicating that these findings tend to generate more varied interpretations even among experienced professionals.

How AI Compares

AI has demonstrated significant potential in enhancing diagnostic processes and reducing variability when reading radiographs. For example, the Vetology team found that the radiologist agreement rate for canine hepatomegaly was 92%, while the Vetology AI tool had an 87.29% sensitivity and a 92.34% specificity. Third-party peer reviews also demonstrate the product’s value.

Researchers at Tufts University, Cummings School of Veterinary Medicine, performed a retrospective, diagnostic case-controlled study to evaluate the performance of Vetology AI’s algorithm in the detection of pleural effusion in canine thoracic radiographs. Sixty-one dogs were included in the study, and 41 of those dogs had confirmed pleural effusion. The AI algorithm determined the presence of pleural effusion with 88.7% accuracy, 90.2% sensitivity, and 81.8% specificity.

Researchers at the Animal Medical Center in New York, New York, performed a prospective, diagnostic accuracy study to evaluate the performance of Vetology AI’s algorithm in diagnosing canine cardiogenic pulmonary edema from thoracic radiographs, using an American College of Veterinary Radiology-certified veterinary radiologist’s interpretation as the reference standard. Four hundred eighty-one cases were analyzed. The radiologist diagnosed 46 of the 481 dogs with cardiogenic pulmonary edema (CPE). The AI algorithm diagnosed 42 of the 46 cases as CPE positive and four of the 46 as CPE negative. When compared to the radiologist’s diagnosis, the AI algorithm had a 92.3% accuracy, 91.3% sensitivity, and 92.4% specificity.

AI radiology tools can never replace the expertise of board-certified veterinary radiologists, but they can serve as valuable assistants, enhancing efficiency, consistency, and diagnostic accuracy. Vetology’s AI tool has been shown to be accurate and reliable, helping to ensure standardized interpretations. While final diagnoses and treatment decisions will always remain the responsibility of an experienced professional, AI serves as a powerful support system, helping to optimize patient care and improve veterinary radiology services.

Want to see AI in action?

To learn more, contact our Vetology team or book a demo for a firsthand look at our AI and teleradiology platform.

Veterinary Radiology AI: Ensuring Accuracy, Trust, and Quality Care

When you take a radiograph to better understand a patient’s condition, an accurate reading of the image is paramount to ensure the animal receives the appropriate treatment. That’s why U.S. board-certified radiologists on the Vetology team worked in conjunction with the technology crew to hone our artificial intelligence (AI) models. By integrating expert oversight, rigorous testing, and quality assurance measures, AI can support diagnostic efficiency while maintaining the trust and reliability veterinarians need for patient care.

To help you better understand the Vetology AI radiology tool, this article explains how it was developed and validated, and how it is improved.

Relying on the Experts

​To develop our AI model, we used more than a million images from hundreds of thousands of cases, ensuring a comprehensive representation of anatomical variations and disease conditions. Each image was evaluated and annotated by a U.S. board-certified veterinary radiologist, providing high-quality, expert-labeled data (i.e., ground truth) that allows the AI to learn from professional interpretations.

Training the AI

To train our veterinary radiology AI tool, we used a combination of deep learning techniques, including convolutional neural networks (CNNs) and large language models (LLMs), supported by confusion matrices and quality assurance (QA) regression testing for evaluation.

Convolutional Neural Networks

CNNs are designed for image recognition and pattern detection, enabling the automated analysis of radiographs with high accuracy. The images first undergo preprocessing to ensure consistency, including orientation correction, clarity enhancement, and contrast adjustment. The CNN then learns to identify features and detect patterns (a minimal architectural sketch follows the list below).

  • The first convolutional layers identify edges, textures, and contrasts, distinguishing bones, organs, and soft tissues.
  • Multi-output CNNs can determine whether an X-ray belongs to a dog or cat and pinpoint the anatomical region being analyzed.
  • Once trained, a CNN can determine orientation and recognize certain abnormalities and conditions.
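As an illustration of the multi-output idea described above, below is a minimal sketch of a CNN with two output heads, one for species and one for anatomical region. The architecture, image size, and class counts are illustrative assumptions and do not reflect the actual Vetology model.

```python
# Minimal multi-output CNN sketch (PyTorch), illustrating the ideas above.
# Architecture, class counts, and image size are illustrative assumptions.
import torch
import torch.nn as nn

class RadiographCNN(nn.Module):
    def __init__(self, n_regions: int = 4):
        super().__init__()
        # Early convolutional layers pick up edges, textures, and contrasts.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two output heads: species (dog vs. cat) and anatomical region.
        self.species_head = nn.Linear(32, 2)
        self.region_head = nn.Linear(32, n_regions)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.species_head(z), self.region_head(z)

# One preprocessed 512x512 grayscale radiograph (batch of 1)
model = RadiographCNN()
species_logits, region_logits = model(torch.randn(1, 1, 512, 512))
print(species_logits.shape, region_logits.shape)  # (1, 2) and (1, 4)
```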

Confusion Matrices

A confusion matrix helps measure how well an AI model classifies radiographic images, ensuring it can correctly identify normal versus abnormal scans, specific conditions, and disease severity. It compares the AI’s predictions with the ground truth, which is determined by U.S. board-certified veterinary radiologists. The table below outlines the relationship between the four key components:

                         Condition present (ground truth)    Condition absent (ground truth)
AI predicts positive     True positive (TP)                  False positive (FP)
AI predicts negative     False negative (FN)                 True negative (TN)

When used to evaluate results, the confusion matrix describes the AI’s performance through key metrics, including the following (a short computation sketch follows this list):

  • Accuracy = (TP + TN) / total cases
  • Sensitivity = TP / (TP + FN) — How well the AI detects conditions
  • Specificity = TN / (TN + FP) — How well the AI identifies normal cases
  • Precision = TP / (TP + FP) — How many positive predictions are correct
  • F1 score = 2 × (Precision × Sensitivity) / (Precision + Sensitivity) — The balance between precision and recall, ensuring the AI does not over- or under-diagnose
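Here is a short sketch showing how these metrics fall out of the four confusion-matrix counts. The counts in the example call are made up for illustration.

```python
# Computes the confusion-matrix metrics listed above from the four counts.
def metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # recall: how well conditions are detected
    specificity = tn / (tn + fp)   # how well normal cases are identified
    precision = tp / (tp + fp)     # how many positive predictions are correct
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "f1": f1,
    }

# Made-up counts for illustration only
print(metrics(tp=90, tn=85, fp=10, fn=15))
```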

Quality Assurance Regression Testing

QA regression testing compares AI-generated results with known labeled images to identify errors, inconsistencies, and areas for improvement. This process enables our developers to fine-tune the AI, reducing false positives and false negatives, and thus enhancing results over time.
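A simplified sketch of what such a regression check might look like is shown below. The hold-out labels, thresholds, and function names are hypothetical; the point is simply to compare a new model version against the previous release’s metrics on a fixed, radiologist-labeled set.

```python
# Sketch of a QA regression check: compare predictions on a fixed,
# radiologist-labeled test set against the previous release's metrics.
# Labels, thresholds, and function names are hypothetical.
def regression_check(predictions, labels, baseline_sensitivity, baseline_specificity):
    tp = sum(1 for p, y in zip(predictions, labels) if p and y)
    tn = sum(1 for p, y in zip(predictions, labels) if not p and not y)
    fp = sum(1 for p, y in zip(predictions, labels) if p and not y)
    fn = sum(1 for p, y in zip(predictions, labels) if not p and y)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # Fail the release if either metric regresses against the previous version.
    assert sensitivity >= baseline_sensitivity, "sensitivity regressed"
    assert specificity >= baseline_specificity, "specificity regressed"
    return sensitivity, specificity

# Hypothetical labeled hold-out set: True = condition present
labels      = [True, True, True, False, False, False, True, False]
predictions = [True, True, False, False, False, True, True, False]
print(regression_check(predictions, labels, 0.70, 0.60))
```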

Large Language Models

Large Language Models (LLMs) are trained to recognize and generate common veterinary diagnostic phrases, sentence structures, and condition descriptions to create professional and structured reports.

Board-certified veterinary radiologists are once again involved to review the generated phrases and confirm that the AI is accurately interpreting the images and the LLM is producing relevant and coherent statements.
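For a sense of how the generative step fits into the workflow, the sketch below assembles a hypothetical prompt from radiologist-entered findings. The template wording, field names, and the `generate` stub are illustrative placeholders rather than Vetology’s actual LLM integration.

```python
# Hypothetical prompt assembly for the generative reporting step.
# The template and the generate() stub are illustrative placeholders.
PROMPT_TEMPLATE = (
    "You are drafting the conclusions and recommendations section of a "
    "veterinary radiology report in the style of {radiologist}.\n"
    "Region: {region}\n"
    "Radiologist findings:\n{findings}\n"
    "Write concise, professional conclusions followed by recommendations."
)

def build_prompt(radiologist: str, region: str, findings: str) -> str:
    return PROMPT_TEMPLATE.format(
        radiologist=radiologist, region=region, findings=findings
    )

def generate(prompt: str) -> str:
    """Placeholder for a call to a large language model."""
    return "[LLM-generated conclusions and recommendations would appear here]"

prompt = build_prompt(
    radiologist="Dr. Example, DACVR",
    region="thorax",
    findings="Mild generalized cardiomegaly; no pulmonary infiltrates identified.",
)
print(generate(prompt))
```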

AI Screening Features

AI screening features enhance veterinary radiology through efficiency tools that promote improved AI reports and consistent image interpretation. Key features include:

  • Image preprocessing and standardization: Pre-AI tools adjust orientation, brightness, and contrast for clearer analysis (a minimal sketch follows this list).
  • Automated cropping: EfficientDet SSD technology isolates the area of concern, improving contextual accuracy for AI interpretations.
  • Anomaly detection: AI identifies abnormalities, such as fractures, tumors, and changes in lung patterns, and can detect species- and region-specific changes. Severity grading models can also help classify the condition’s severity.
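As a rough illustration of the preprocessing and standardization step in the first bullet above, the sketch below resizes a radiograph, converts it to grayscale, and applies a simple percentile-based contrast stretch. The parameters are assumptions for illustration, not the platform’s actual pipeline.

```python
# Sketch of a pre-AI standardization step: resize, grayscale conversion,
# and a simple percentile-based contrast stretch. Parameters are illustrative.
import numpy as np
from PIL import Image

def standardize(path: str, size: int = 512) -> np.ndarray:
    img = Image.open(path).convert("L")    # grayscale
    img = img.resize((size, size))         # consistent input dimensions
    pixels = np.asarray(img, dtype=np.float32)

    # Contrast stretch: clip extreme tails, then rescale to [0, 1]
    lo, hi = np.percentile(pixels, [1, 99])
    pixels = np.clip(pixels, lo, hi)
    return (pixels - lo) / (hi - lo + 1e-6)

# Example (hypothetical file name):
# standardized = standardize("thorax_lateral.png")
# standardized.shape -> (512, 512), values in [0, 1]
```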

Keeping Updated

To ensure our AI model remains accurate and aligned with evolving veterinary radiology practices, we regularly update it with new data to integrate the latest medical findings and maintain optimal performance. The modifications undergo a structured change management process to ensure the updates improve accuracy without introducing errors, and we track all changes between AI versions to maintain transparency and traceability of updates.

Vetology’s AI radiology model is designed to support, not replace, veterinary expertise, improving image analysis and providing clinicians with faster, more consistent insights. Utilizing an AI radiology tool can help veterinary teams make more informed decisions before seeking expert consultation. Veterinarians can use this tool as an initial screening step before sending cases to a teleradiologist, helping streamline workflows, prioritize urgent cases, and improve diagnostic efficiency.

Want to see AI in action?

To learn more, contact our Vetology team, or book a demo for a firsthand look at our AI and teleradiology platform.

AI in Veterinary Imaging: What to Know

Artificial Intelligence (AI) is a powerful assistive tool in veterinary medicine. In the world of imaging, it offers exciting possibilities for new approaches to current workflows that can improve patient outcomes, support starting treatment plans sooner, and, in the best-case scenario, relieve some of the decision fatigue associated with patient care. While AI’s potential is clear, it is important to recognize that its implementation may call for a shift in how veterinarians approach imaging, read radiographs, and initiate their diagnostic pathways. This article explores why AI is important, how to use it effectively, and the collaborative effort needed to integrate this technology into practice.

Why An AI Radiology Report Matters in Veterinary Medicine

Fast, consistent patient screening results are at the core of an AI radiology report’s unique value. By analyzing images for specific patterns and abnormalities, an AI report can highlight areas of concern and support veterinarians in making confident treatment decisions. By all means, rely on your expertise in reading radiographs, but why not check your answers when you can? Asking for help is a critical skill in a successful practice. Vetology’s Virtual AI Radiology Report is just that: a support tool, an answer sheet, a guide. It’s one of many tools in your medical toolkit, and it is essential to remember that it was never intended to replace veterinary expertise or radiologists; it was built to complement both.

Using AI in Imaging Diagnostics

In practice, AI tools analyze radiographs by running specialized classifiers tailored to detect specific conditions or abnormalities. For example, when assessing a feline abdomen radiograph, the AI might evaluate features like liver size or the presence of urocystoliths. These observations are presented as screening results, not diagnoses, guiding veterinarians toward further tests or treatments. The effectiveness of AI depends on the quality of the submitted radiographs. Clear, well-positioned lateral and VD images that focus on the area of concern lead to more accurate reports. This underscores the importance of maintaining high imaging standards in clinical workflows.
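For illustration only, the sketch below shows one way condition-specific classifiers could be combined into a screening summary. The classifier names, scores, and threshold are hypothetical and are not Vetology’s actual classifiers or outputs.

```python
# Illustrative sketch: run condition-specific classifiers on an image and
# collect a screening summary. Names, scores, and threshold are hypothetical.
from typing import Callable, Dict

def screen(image, classifiers: Dict[str, Callable], threshold: float = 0.5) -> dict:
    """Run each classifier on the image and flag findings above the threshold."""
    scores = {name: clf(image) for name, clf in classifiers.items()}
    return {
        "scores": scores,
        "flagged": [name for name, s in scores.items() if s >= threshold],
        "note": "Screening results only - not a diagnosis.",
    }

# Hypothetical feline abdomen classifiers returning probability-like scores
feline_abdomen = {
    "hepatomegaly": lambda img: 0.81,
    "urocystoliths": lambda img: 0.12,
}
print(screen(image=None, classifiers=feline_abdomen))
```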

Navigating the Learning Curve Together

As with any new tool or skillset, adopting AI in veterinary medicine involves a learning curve, some change, and some practice. AI is evolving and improving, and developing effective tools requires close collaboration between veterinary professionals and developers. Input from veterinarians helps refine systems, ensuring they address real-world clinical needs. Academic peer review supports the integrity of the tool, and clinicians benefit from training, practice, and patience with these tools as they learn their capabilities and limitations.

Vetology views integrating with veterinary workflows as a collective effort. Our collective success depends on thoughtful implementation, high-quality radiographs, and collaboration. By working together, veterinarians, radiologists, and technologists can create tools that reinvent workflows that support patient care and maintain the highest standards of safety. This partnership is critical to ensuring that this new approach to imaging evolves as a trusted and valuable resource for the veterinary community.

Want to see AI in action?

To tour the platform and learn more, contact our team, or book a demo for a firsthand look at our AI and teleradiology platform.

10 Best AI Veterinary Tools (March 2025)

We’re honored to be recognized among other best-in-class AI veterinary tools by Unite.AI!

At Vetology, our mission is to support veterinarians with innovative AI and teleradiology solutions that streamline diagnostic workflows and improve patient outcomes. Being featured alongside other leading technologies in the veterinary space reinforces our dedication to advancing veterinary medicine through AI-driven insights.

We’ve highlighted the section on Vetology below, but we encourage you to check out the full article to discover other exciting, emerging tools in the veterinary and AI space. The future might be many things, but it’s definitely not boring!

The veterinary field is undergoing a transformation through AI-powered tools that enhance everything from clinical documentation to cancer treatment. These innovative platforms are not just digitizing traditional processes – they are fundamentally reshaping how veterinary professionals approach patient care, diagnostic accuracy, and practice management. In this guide, we’ll explore some of the groundbreaking AI veterinary tools that demonstrate the incredible potential of artificial intelligence in animal healthcare, from smart collars that monitor vital signs to sophisticated oncology platforms that process billions of data points.

Vetology:

Vetology functions as an advanced AI diagnostic center where machine learning systems process veterinary imaging data to provide rapid clinical insights. The platform combines sophisticated image recognition technology with teleradiology services, transforming how veterinary practices approach diagnostic imaging while maintaining high accuracy standards through AI-human collaboration.

At its core, Vetology’s AI Virtual Radiologist engine processes radiographic images through multiple analytical layers. This system simultaneously evaluates anatomical structures, detects abnormalities, and generates detailed clinical reports within minutes. The platform integrates with existing practice management systems, enabling workflow integration while maintaining continuous synchronization with clinic records.

The system’s AI extends beyond basic image analysis, incorporating specialized algorithms for automated cardiac measurements and vertebral heart scoring. This technical foundation enables the platform to process multiple imaging modalities, achieving a 92% agreement rate with board-certified radiologists through its advanced pattern recognition capabilities. The platform also maintains a sophisticated teleradiology network, creating a hybrid system that combines AI efficiency with specialist expertise for complex cases.

Key Features:

  • AI diagnostic engine with 5-minute report generation capabilities
  • Automated cardiac measurement system with vertebral heart scoring
  • Multi-modality processing framework supporting radiographs, CT, and MRI
  • Integration architecture supporting major practice management systems
  • Pattern recognition algorithms trained on extensive veterinary datasets

Read the Full Article on Today's Veterinary Business

This article was originally published on October 1, 2022
