Search Results
Found 3 results
510(k) Data Aggregation
(106 days)
Limbus AI Inc.
Limbus Contour is a software-only medical device intended for use by trained radiation oncologists, dosimetrists and physicists to derive optimal contours for input to radiation treatment planning.
Supported image modalities are Computed Tomography and Magnetic Resonance. The Limbus Contour Software assists in the following scenarios:
Operates in conjunction with radiation treatment planning systems or DICOM viewing systems to load, save, and display medical images and contours for treatment evaluation and treatment planning.
Creation, transformation, and modification of contours for applications including, but not limited to: transferring contours to radiotherapy treatment planning systems, aiding adaptive therapy and archiving contours for patient follow-up.
Localization and definition of healthy anatomical structures.
Limbus Contour is not intended for use with digital mammography.
Limbus Contour is a stand-alone software medical device. It is a single-purpose, cross-platform application for automatic contouring (segmentation) of CT/MRI DICOM images via pre-trained, expert-curated machine learning models. The software is intended to be used by trained medical professionals to derive contours for input to radiation treatment planning. The Limbus Contour software segments normal tissues using machine learning models and further post-processing of the model prediction outputs. Limbus Contour does not display or store DICOM images and relies on existing radiotherapy treatment planning systems (TPS) and DICOM image viewers for display and modification of generated segmentations. Limbus Contour interfaces with the user's operating system file system (importing DICOM image .dcm files and exporting segmented DICOM RT-Structure Set .dcm files).
Here's a summary of the acceptance criteria and the study details for the Limbus Contour device, based on the provided FDA 510(k) submission information:
1. Acceptance Criteria and Reported Device Performance
The acceptance criterion for each contoured structure is that the lower edge of the 95% confidence interval of the Limbus DSC (Dice Similarity Coefficient) must be greater than or equal to the "Test DSC Threshold," which is derived as the mean minus the standard deviation of reference-model DSCs from published machine learning autosegmentation models.
Structure | Limbus Mean DSC | Limbus DSC Std Dev | Number of Scans | Limbus DSC lower 95% conf edge | Test DSC Threshold | Result |
---|---|---|---|---|---|---|
A_Aorta | 0.909095 | 0.0455771 | 10 | 0.87649337 | 0.81 | Passed |
A_Aorta_Base | 0.979588 | 0.0286193 | 10 | 0.95911641 | 0.81 | Passed |
A_Aorta_I | 0.938016 | 0.10304303 | 10 | 0.86430858 | 0.81 | Passed |
A_Celiac | 0.781502 | 0.27272084 | 10 | 0.58642282 | 0.26 | Passed |
A_LAD | 0.692766 | 0.06590144 | 10 | 0.64562622 | 0.26 | Passed |
A_Mesenteric_S | 0.857257 | 0.14185425 | 10 | 0.75578763 | 0.26 | Passed |
A_Pulmonary | 0.901867 | 0.03499015 | 10 | 0.87683829 | 0.85 | Passed |
Applicator_Cylinder (beta) | 0.80111733 | 0.33037573 | 15 | 0.60816274 | 0.374 | Passed |
Applicator_Ring (beta) | 0.963863 | 0.07306595 | 10 | 0.9115984 | 0.374 | Passed |
Atrium_L | 0.977044 | 0.0180652 | 10 | 0.96412183 | 0.79 | Passed |
Atrium_R | 0.978451 | 0.01852677 | 10 | 0.96519867 | 0.78 | Passed |
Bladder | 0.96601238 | 0.05220935 | 21 | 0.94024138 | 0.935 | Passed |
Bladder (MRI) | 0.963518 | 0.01177413 | 10 | 0.95509588 | 0.88 | Passed |
Bladder_CBCT | 0.959173 | 0.04229406 | 10 | 0.92891975 | 0.91 | Passed |
Bladder_HDR | 0.931167 | 0.06094679 | 10 | 0.88757132 | 0.56674243 | Passed |
Bladder_HDR (MRI) | 0.896883 | 0.13171833 | 10 | 0.80266393 | 0.79 | Passed |
Bone Marrow_Pelvic | 0.995414 | 0.00407252 | 10 | 0.9925009 | 0.805 | Passed |
Bone_Hyoid | 0.85411417 | 0.04163051 | 12 | 0.82693015 | 0.77 | Passed |
Bone_Illium_L | 0.9888075 | 0.01103973 | 12 | 0.98159874 | 0.76 | Passed |
Bone_Illium_R | 0.99058833 | 0.00575056 | 12 | 0.98683332 | 0.76 | Passed |
Bone_Ischium_L | 0.938985 | 0.01573502 | 10 | 0.92772963 | 0.76 | Passed |
Bone_Ischium_R | 0.93923 | 0.01613541 | 10 | 0.92768822 | 0.76 | Passed |
Bone_Mandible | 0.94024769 | 0.01266685 | 13 | 0.93230094 | 0.922 | Passed |
Bone_Pelvic | 0.98383 | 0.00511637 | 10 | 0.98017022 | 0.929 | Passed |
Bowel | 0.90743217 | 0.06592406 | 23 | 0.87633846 | 0.74 | Passed |
Bowel_Bag | 0.93979478 | 0.03659061 | 23 | 0.92253647 | 0.752 | Passed |
Bowel_Bag_Extend | 0.971576 | 0.01582803 | 20 | 0.9635702 | 0.752 | Passed |
Bowel_Bag_Full | 0.9679985 | 0.01354157 | 20 | 0.9611492 | 0.752 | Passed |
Bowel_Bag_Superior | 0.93686455 | 0.09357214 | 11 | 0.8730466 | 0.752 | Passed |
Bowel_Extend | 0.941682 | 0.03040818 | 20 | 0.92630159 | 0.74 | Passed |
Bowel_Full | 0.92351381 | 0.03444493 | 21 | 0.90651148 | 0.74 | Passed |
Bowel_HDR | 0.841368 | 0.05558462 | 10 | 0.80160792 | 0.2008343 | Passed |
Bowel_HDR (MRI) | 0.55831818 | 0.24969816 | 11 | 0.38801937 | 0.31 | Passed |
Bowel_Superior | 0.90214273 | 0.03756911 | 11 | 0.87651989 | 0.74 | Passed |
BrachialPlex_L | 0.691605 | 0.10786794 | 10 | 0.61444628 | 0.39 | Passed |
BrachialPlex_R | 0.693809 | 0.11005989 | 10 | 0.61508237 | 0.39 | Passed |
Brain | 0.992205 | 0.00251205 | 16 | 0.99078444 | 0.988 | Passed |
Brainstem | 0.90334688 | 0.03859191 | 16 | 0.88152315 | 0.695 | Passed |
Brainstem (MRI) | 0.925526 | 0.02877815 | 10 | 0.90494078 | 0.725 | Passed |
Breast_Implant_L | 0.992884 | 0.00662727 | 10 | 0.98814346 | 0.865 | Passed |
Breast_Implant_R | 0.973663 | 0.03491225 | 10 | 0.94869001 | 0.865 | Passed |
Breast_L | 0.954514 | 0.02763163 | 10 | 0.9347489 | 0.726 | Passed |
Breast_R | 0.93952091 | 0.04383671 | 11 | 0.90962345 | 0.7345 | Passed |
Bronchus | 0.839515 | 0.06515951 | 10 | 0.79290593 | 0.76 | Passed |
CW2cm_L | 0.998955 | 0.00118886 | 10 | 0.9981046 | 0.72 | Passed |
CW2cm_R | 0.999376 | 0.00101477 | 10 | 0.99865013 | 0.72 | Passed |
Canal_Anal | 0.87596095 | 0.13633659 | 21 | 0.808664 | 0.803 | Passed |
Canal_Anal_HDR | 0.942891 | 0.04773688 | 10 | 0.90874446 | 0.56167132 | Passed |
Canal_Anal_HDR (MRI) | 0.610295 | 0.35031087 | 10 | 0.35971511 | 0.31 | Passed |
Carina | 1 | 0 | 10 | 1 | 0.77 | Passed |
CaudaEquina | 0.882098 | 0.06633305 | 10 | 0.83464949 | 0.722 | Passed |
Cavity_Oral | 0.913113 | 0.0386665 | 10 | 0.88545458 | 0.8 | Passed |
Cerebellum | 0.983219 | 0.01399611 | 10 | 0.97320748 | 0.83 | Passed |
Chestwall_L | 0.95907091 | 0.00299448 | 11 | 0.95702862 | 0.72 | Passed |
Chestwall_R | 0.95957182 | 0.00327572 | 11 | 0.95733772 | 0.72 | Passed |
Clavicle_L | 0.98014375 | 0.01256694 | 16 | 0.97303715 | 0.93 | Passed |
Clavicle_R | 0.981565 | 0.01013648 | 16 | 0.97583282 | 0.93 | Passed |
Cochlea_L | 0.702311 | 0.10183115 | 10 | 0.62947045 | 0.533 | Passed |
Cochlea_R | 0.686758 | 0.14712802 | 10 | 0.58151627 | 0.545 | Passed |
Colon_Sigmoid | 0.81625381 | 0.15924956 | 21 | 0.73764681 | 0.704 | Passed |
Colon_Sigmoid_HDR | 0.865505 | 0.12156688 | 10 | 0.77854734 | 0.30928644 | Passed |
Colon_Sigmoid_HDR (MRI) | 0.753036 | 0.15966944 | 10 | 0.6388233 | 0.47 | Passed |
Cornea_L | 0.96183182 | 0.06990272 | 11 | 0.91415686 | 0.489 | Passed |
Cornea_L (MRI) | 0.913718 | 0.03513108 | 10 | 0.88858848 | 0.489 | Passed |
Cornea_R | 0.96934727 | 0.05299966 | 11 | 0.93320051 | 0.498 | Passed |
Cornea_R (MRI) | 0.927223 | 0.02302511 | 10 | 0.91075297 | 0.498 | Passed |
Duodenum | 0.828433 | 0.18461132 | 10 | 0.69637919 | 0.649 | Passed |
ESTRO_LN_Ax_IP_L | 0.984552 | 0.0225043 | 10 | 0.96845451 | 0.79 | Passed |
ESTRO_LN_Ax_IP_R | 0.988801 | 0.01830441 | 10 | 0.97570773 | 0.796 | Passed |
ESTRO_LN_Ax_L1_L | 0.997122 | 0.00681545 | 10 | 0.99224686 | 0.66 | Passed |
ESTRO_LN_Ax_L1_R | 0.967503 | 0.01693458 | 10 | 0.95538957 | 0.66 | Passed |
ESTRO_LN_Ax_L2+IP_Fill_L | 0.992986 | 0.01314147 | 10 | 0.98358582 | 0.73 | Passed |
ESTRO_LN_Ax_L2+IP_Fill_R | 0.994206 | 0.01093726 | 10 | 0.9863825 | 0.73 | Passed |
ESTRO_LN_Ax_L2_L | 0.995192 | 0.01077199 | 10 | 0.98748672 | 0.73 | Passed |
ESTRO_LN_Ax_L2_R | 0.997352 | 0.00448639 | 10 | 0.99414286 | 0.73 | Passed |
ESTRO_LN_Ax_L3_L | 0.993382 | 0.00884358 | 10 | 0.98705612 | 0.51 | Passed |
ESTRO_LN_Ax_L3_R | 0.992149 | 0.01468854 | 10 | 0.98164218 | 0.51 | Passed |
ESTRO_LN_IMN_L | 0.980597 | 0.02745317 | 10 | 0.96095955 | 0.39 | Passed |
ESTRO_LN_IMN_L_Expand | 0.982079 | 0.05662552 | 10 | 0.94157436 | 0.39 | Passed |
ESTRO_LN_IMN_R | 0.974402 | 0.04214952 | 10 | 0.94425214 | 0.39 | Passed |
ESTRO_LN_IMN_R_Expand | 0.977852 | 0.06968747 | 10 | 0.92800405 | 0.39 | Passed |
ESTRO_LN_Sclav_L | 0.97586 | 0.03136266 | 10 | 0.95342606 | 0.7 | Passed |
ESTRO_LN_Sclav_R | 0.98735 | 0.02268151 | 10 | 0.97112575 | 0.7 | Passed |
Esophagus | 0.83741083 | 0.02651585 | 12 | 0.82009643 | 0.67 | Passed |
Eye_L | 0.93511824 | 0.03324782 | 17 | 0.91687796 | 0.894 | Passed |
Eye_L (MRI) | 0.950337 | 0.01463563 | 10 | 0.93986803 | 0.847 | Passed |
Eye_R | 0.94191706 | 0.03257919 | 17 | 0.92404361 | 0.902 | Passed |
Eye_R (MRI) | 0.939666 | 0.03581356 | 10 | 0.9140483 | 0.849 | Passed |
Femur_Head_L | 0.961299 | 0.00888921 | 10 | 0.95494049 | 0.93 | Passed |
Femur_Head_L (MRI) | 0.938162 | 0.04781144 | 10 | 0.90396214 | 0.77 | Passed |
Femur_Head_L_CBCT | 0.977939 | 0.01378171 | 10 | 0.96808085 | 0.88 | Passed |
Femur_Head_R | 0.961381 | 0.01105991 | 10 | 0.95346976 | 0.937 | Passed |
Femur_Head_R (MRI) | 0.948586 | 0.02852155 | 10 | 0.92818433 | 0.77 | Passed |
Femur_Head_R_CBCT | 0.989667 | 0.01081208 | 10 | 0.98193304 | 0.88 | Passed |
Gallbladder | 0.946422 | 0.05882969 | 10 | 0.9043407 | 0.809 | Passed |
Glnd_Lacrimal_L | 0.76574538 | 0.0785035 | 13 | 0.71649497 | 0.489 | Passed |
Glnd_Lacrimal_R | 0.73474077 | 0.09508335 | 13 | 0.67508872 | 0.498 | Passed |
Glnd_Submand_L | 0.838183 | 0.08845188 | 10 | 0.77491273 | 0.725 | Passed |
Glnd_Submand_R | 0.882672 | 0.0245712 | 10 | 0.86509605 | 0.595 | Passed |
Glnd_Thyroid | 0.840575 | 0.03434333 | 10 | 0.81600897 | 0.716 | Passed |
GreatVes | 0.956281 | 0.01660489 | 10 | 0.9444034 | 0.81 | Passed |
Heart | 0.95488833 | 0.02805647 | 12 | 0.93656793 | 0.89 | Passed |
Heart+A_Pulm | 0.995663 | 0.01079707 | 10 | 0.98793977 | 0.89 | Passed |
Hippocampus_L | 0.897474 | 0.14363431 | 10 | 0.79473135 | 0.45 | Passed |
Hippocampus_L (MRI) | 0.801092 | 0.07687695 | 10 | 0.74610136 | 0.618 | Passed |
Hippocampus_R | 0.841933 | 0.23470004 | 10 | 0.67405037 | 0.45 | Passed |
Hippocampus_R (MRI) | 0.804229 | 0.07348396 | 10 | 0.75166539 | 0.618 | Passed |
Humerus_L | 0.981592 | 0.03366976 | 10 | 0.95750778 | 0.93 | Passed |
Humerus_R | 0.983804 | 0.02794829 | 10 | 0.96381239 | 0.93 | Passed |
InternalAuditoryCanal_L | 0.719663 | 0.27119782 | 10 | 0.52567325 | 0.41 | Passed |
InternalAuditoryCanal_R | 0.778302 | 0.29907667 | 10 | 0.5643703 | 0.41 | Passed |
Kidney_L | 0.97211 | 0.0055787 | 10 | 0.96811951 | 0.83 | Passed |
Kidney_R | 0.971235 | 0.00508737 | 10 | 0.96759597 | 0.85 | Passed |
LN_Ax_L1_L | 0.93347 | 0.03827463 | 10 | 0.90609188 | 0.66 | Passed |
LN_Ax_L1_R | 0.957366 | 0.01855924 | 10 | 0.94409044 | 0.66 | Passed |
LN_Ax_L2_L | 0.797847 | 0.03448156 | 10 | 0.77318209 | 0.73 | Passed |
LN_Ax_L2_R | 0.836689 | 0.03793359 | 10 | 0.80955483 | 0.73 | Passed |
LN_Ax_L3_L | 0.841469 | 0.02407574 | 10 | 0.82424745 | 0.51 | Passed |
LN_Ax_L3_R | 0.833202 | 0.05413932 | 10 | 0.79447576 | 0.51 | Passed |
LN_Ax_Sclav_L | 0.854859 | 0.07708553 | 10 | 0.79971917 | 0.66 | Passed |
LN_Ax_Sclav_R | 0.839354 | 0.0636715 | 10 | 0.79380932 | 0.66 | Passed |
LN_IMN_L | 0.681072 | 0.05716488 | 10 | 0.64018155 | 0.39 | Passed |
LN_IMN_L_Expand | 0.974158 | 0.08171958 | 10 | 0.9157034 | 0.39 | Passed |
LN_IMN_R | 0.754624 | 0.0588019 | 10 | 0.71256258 | 0.39 | Passed |
LN_IMN_R_Expand | 0.969235 | 0.09728747 | 10 | 0.89964457 | 0.39 | Passed |
LN_Inguinal_L | 0.987752 | 0.01196273 | 10 | 0.97919497 | 0.779 | Passed |
LN_Inguinal_R | 0.975856 | 0.01828094 | 10 | 0.96277951 | 0.779 | Passed |
LN_Neck_IA | 0.88038818 | 0.10469436 | 11 | 0.80898467 | 0.41 | Passed |
LN_Neck_IA6 | 0.94597364 | 0.03537206 | 11 | 0.92184923 | 0.896 | Passed |
LN_Neck_IB_L | 0.918553 | 0.02691603 | 10 | 0.89929977 | 0.896 | Passed |
LN_Neck_IB_R | 0.916248 | 0.01954066 | 10 | 0.90227042 | 0.896 | Passed |
LN_Neck_III_L | 0.924377 | 0.02716647 | 10 | 0.90494463 | 0.752 | Passed |
LN_Neck_III_R | 0.903805 | 0.03651978 | 10 | 0.87768214 | 0.775 | Passed |
LN_Neck_II_L | 0.921425 | 0.02096226 | 10 | 0.90643054 | 0.894 | Passed |
LN_Neck_II_R | 0.919918 | 0.02031001 | 10 | 0.9053901 | 0.894 | Passed |
LN_Neck_IV_L | 0.837067 | 0.10669372 | 10 | 0.76074821 | 0.655 | Passed |
LN_Neck_IV_R | 0.813474 | 0.07643769 | 10 | 0.75879757 | 0.655 | Passed |
LN_Neck_L | 0.86875 | 0.04264226 | 12 | 0.84090532 | 0.779 | Passed |
LN_Neck_R | 0.86855 | 0.04499896 | 12 | 0.83916643 | 0.779 | Passed |
LN_Neck_VI | 0.93822083 | 0.07273804 | 12 | 0.89072412 | 0.722 | Passed |
LN_Neck_VIIAB_L | 0.704562 | 0.14161814 | 10 | 0.60326153 | 0.55 | Passed |
LN_Neck_VIIAB_R | 0.684087 | 0.15673354 | 10 | 0.57197437 | 0.55 | Passed |
LN_Neck_VIIA_L | 0.973697 | 0.03639132 | 10 | 0.94766603 | 0.54 | Passed |
LN_Neck_VIIA_R | 0.963045 | 0.0518865 | 10 | 0.92593022 | 0.54 | Passed |
LN_Neck_VIIB_L | 0.979443 | 0.02540021 | 10 | 0.96127405 | 0.69 | Passed |
LN_Neck_VIIB_R | 0.9727 | 0.02326651 | 10 | 0.9560573 | 0.71 | Passed |
LN_Neck_V_L | 0.899668 | 0.05719485 | 10 | 0.85875611 | 0.785 | Passed |
LN_Neck_V_R | 0.855186 | 0.05671539 | 10 | 0.81461707 | 0.775 | Passed |
LN_Pelvics | 0.90169318 | 0.05091482 | 22 | 0.877139 | 0.779 | Passed |
LN_Pelvics_CBCT | 0.974742 | 0.04202674 | 10 | 0.94467997 | 0.58 | Passed |
LN_Sclav_L | 0.96093 | 0.05712461 | 10 | 0.92006835 | 0.7 | Passed |
LN_Sclav_R | 0.958498 | 0.02948648 | 10 | 0.93740611 | 0.7 | Passed |
Larynx | 0.898777 | 0.05827018 | 10 | 0.85709592 | 0.77 | Passed |
Lens_L | 0.78292471 | 0.08119035 | 17 | 0.73838241 | 0.616 | Passed |
Lens_R | 0.76047471 | 0.07902615 | 17 | 0.71711973 | 0.449 | Passed |
Lips | 0.824696 | 0.14948194 | 10 | 0.71777049 | 0.68 | Passed |
Liver | 0.97773385 | 0.01147248 | 13 | 0.9705364 | 0.92 | Passed |
Lobe_Temporal_L | 0.944744 | 0.07569022 | 10 | 0.89060224 | 0.83 | Passed |
Lobe_Temporal_R | 0.948456 | 0.06837365 | 10 | 0.89954784 | 0.83 | Passed |
Lung_L | 0.983115 | 0.00654768 | 10 | 0.9784314 | 0.96 | Passed |
Lung_R | 0.983649 | 0.00652109 | 10 | 0.97898441 | 0.96 | Passed |
Mesorectum | 0.827965 | 0.05209883 | 10 | 0.79069833 | 0.779 | Passed |
Musc_Constrict | 0.869097 | 0.05737849 | 10 | 0.82805376 | 0.61 | Passed |
Musc_PecMinor_L | 0.869259 | 0.04744788 | 10 | 0.83531919 | 0.79 | Passed |
Musc_PecMinor_R | 0.863584 | 0.06177418 | 10 | 0.81939649 | 0.796 | Passed |
Musc_Sclmast_L | 0.946117 | 0.02773018 | 10 | 0.9262814 | 0.803 | Passed |
Musc_Sclmast_R | 0.945291 | 0.03302699 | 10 | 0.92166656 | 0.803 | Passed |
OpticChiasm | 0.65929882 | 0.1679447 | 17 | 0.56716174 | 0.41 | Passed |
OpticNrv_L | 0.82576941 | 0.06203798 | 17 | 0.79173441 | 0.73 | Passed |
OpticNrv_R | 0.82894294 | 0.06130553 | 17 | 0.79530977 | 0.72 | Passed |
Optics (MRI) | 0.764846 | 0.05410538 | 10 | 0.72614403 | 0.504 | Passed |
Pancreas | 0.884343 | 0.09900704 | 10 | 0.81352255 | 0.769 | Passed |
Parotid_L | 0.88352083 | 0.06794505 | 12 | 0.83915386 | 0.778 | Passed |
Parotid_R | 0.88281667 | 0.05035732 | 12 | 0.84993419 | 0.803 | Passed |
PelvisVessels | 0.914998 | 0.02637213 | 10 | 0.89613382 | 0.26 | Passed |
PenileBulb | 0.84850818 | 0.04605243 | 11 | 0.81709956 | 0.705 | Passed |
PenileBulb (MRI) | 0.73231 | 0.27283179 | 10 | 0.53715145 | 0.46 | Passed |
Pericardium | 0.984828 | 0.0185493 | 10 | 0.97155955 | 0.8688 | Passed |
Pericardium+A_Pulm | 0.994973 | 0.01235765 | 10 | 0.98613348 | 0.89 | Passed |
Pituitary | 0.75041867 | 0.15158537 | 15 | 0.66188586 | 0.41 | Passed |
Prostate | 0.934093 | 0.02193268 | 10 | 0.91840439 | 0.88 | Passed |
Prostate (MRI) | 0.915164 | 0.03096645 | 10 | 0.89301348 | 0.8 | Passed |
ProstateBed | 0.74691333 | 0.1454049 | 15 | 0.6619902 | 0.5 | Passed |
ProstateFiducials (beta) | 0.61422 | 0.24931989 | 10 | 0.43587968 | 0.41 | Passed |
Prostate_CBCT | 0.961269 | 0.04241179 | 10 | 0.93093154 | 0.79 | Passed |
PubicSymphys | 0.943743 | 0.02100908 | 10 | 0.92871505 | 0.76 | Passed |
PubicSymphys (MRI) | 0.779585 | 0.11210947 | 10 | 0.69939229 | 0.54 | Passed |
Rectum | 0.88681762 | 0.08654191 | 21 | 0.84409976 | 0.803 | Passed |
Rectum (MRI) | 0.934619 | 0.02030278 | 10 | 0.92009628 | 0.77 | Passed |
Rectum_CBCT | 0.963103 | 0.02896341 | 10 | 0.94238526 | 0.87 | Passed |
Rectum_HDR | 0.918553 | 0.09535355 | 10 | 0.85034592 | 0.56167132 | Passed |
Rectum_HDR (MRI) | 0.781 | 0.15188698 | 10 | 0.67235415 | 0.58 | Passed |
Retina_L | 0.90761364 | 0.1929304 | 11 | 0.7760315 | 0.489 | Passed |
Retina_L (MRI) | 0.953271 | 0.04079533 | 10 | 0.92408981 | 0.489 | Passed |
Retina_R | 0.91191636 | 0.1905022 | 11 | 0.78199031 | 0.498 | Passed |
Retina_R (MRI) | 0.92854 | 0.05520214 | 10 | 0.88905351 | 0.498 | Passed |
Ribs_L | 0.94473545 | 0.00563264 | 11 | 0.94089389 | 0.81 | Passed |
Ribs_R | 0.94621636 | 0.00495204 | 11 | 0.94283898 | 0.81 | Passed |
Sacrum | 0.97012438 | 0.01642714 | 16 | 0.96083483 | 0.82 | Passed |
Sacrum (MRI) | 0.966632 | 0.04786265 | 10 | 0.9323955 | 0.77 | Passed |
SeminalVes | 0.82148 | 0.16089309 | 10 | 0.70639202 | 0.5 | Passed |
SeminalVes (MRI) | 0.833995 | 0.05384245 | 10 | 0.79548111 | 0.39 | Passed |
SeminalVes_CBCT | 0.904653 | 0.05295257 | 10 | 0.86677565 | 0.621 | Passed |
SpaceOARVue (beta) | 0.866934 | 0.0421535 | 10 | 0.8367813 | 0.5 | Passed |
SpinalCanal | 0.8971765 | 0.06232767 | 20 | 0.86565125 | 0.722 | Passed |
SpinalCord | 0.87788679 | 0.06353613 | 28 | 0.8507265 | 0.722 | Passed |
Spleen | 0.98238429 | 0.00724712 | 14 | 0.97800307 | 0.958 | Passed |
Sternum | 0.968506 | 0.00927831 | 10 | 0.96186916 | 0.8 | Passed |
Stomach | 0.92353818 | 0.04262548 | 11 | 0.89446681 | 0.64 | Passed |
Trachea | 0.900195 | 0.04891354 | 10 | 0.86520679 | 0.77 | Passed |
Urethra_HDR | 0.68898 | 0.26984128 | 10 | 0.49596059 | 0.26 | Passed |
Urethra_HDR (MRI) | 0.558433 | 0.30664371 | 10 | 0.33908855 | 0.26 | Passed |
Uterus+Cervix | 0.923876 | 0.07561527 | 10 | 0.86978785 | 0.8525 | Passed |
VB_C1 | 0.871119 | 0.09654266 | 10 | 0.80206134 | 0.389 | Passed |
VB_C2 | 0.890465 | 0.08375338 | 10 | 0.83055561 | 0.389 | Passed |
VB_C3 | 0.882984 | 0.07768781 | 10 | 0.82741335 | 0.389 | Passed |
VB_C4 | 0.847607 | 0.13488571 | 10 | 0.75112228 | 0.389 | Passed |
VB_C5 | 0.735888 | 0.25082081 | 10 | 0.55647407 | 0.389 | Passed |
VB_C6 | 0.662313 | 0.36427326 | 10 | 0.40174571 | 0.389 | Passed |
VB_C7 | 0.686772 | 0.36426675 | 10 | 0.42620937 | 0.389 | Passed |
VB_L1 | 0.7397 | 0.36842487 | 12 | 0.49912477 | 0.389 | Passed |
VB_L2 | 0.85353818 | 0.3119903 | 11 | 0.64075498 | 0.389 | Passed |
VB_L3 | 0.89139273 | 0.29651247 | 11 | 0.68916569 | 0.389 | Passed |
VB_L4 | 0.88930818 | 0.29491517 | 11 | 0.68817053 | 0.389 | Passed |
VB_L5 | 0.971761 | 0.01978041 | 10 | 0.95761193 | 0.389 | Passed |
VB_T01 | 0.749499 | 0.20357809 | 10 | 0.60387813 | 0.389 | Passed |
VB_T02 | 0.900861 | 0.10665309 | 10 | 0.82457127 | 0.389 | Passed |
VB_T03 | 0.846845 | 0.23384828 | 10 | 0.67957164 | 0.389 | Passed |
VB_T04 | 0.871065 | 0.16033768 | 10 | 0.7563743 | 0.389 | Passed |
VB_T05 | 0.868184 | 0.10527773 | 10 | 0.79287808 | 0.389 | Passed |
VB_T06 | 0.856586 | 0.1735606 | 10 | 0.73243685 | 0.389 | Passed |
VB_T07 | 0.895207 | 0.09111821 | 10 | 0.83002949 | 0.389 | Passed |
VB_T08 | 0.90946273 | 0.09927246 | 11 | 0.84175706 | 0.389 | Passed |
VB_T09 | 0.895233 | 0.19379417 | 10 | 0.75661063 | 0.389 | Passed |
VB_T10 | 0.85180692 | 0.25673522 | 13 | 0.69073999 | 0.389 | Passed |
VB_T11 | 0.90543538 | 0.17253913 | 13 | 0.79719022 | 0.389 | Passed |
VB_T12 | 0.73466077 | 0.38432697 | 13 | 0.49354712 | 0.389 | Passed |
VBs | 0.984448 | 0.01042268 | 10 | 0.97699258 | 0.579 | Passed |
V_Venacava_I | 0.95366786 | 0.05427303 | 14 | 0.92085737 | 0.72 | Passed |
V_Venacava_S | 0.851219 | 0.0503676 | 10 | 0.81519069 | 0.8 | Passed |
Vagina | 0.897341 | 0.06081143 | 10 | 0.85384215 | 0.665 | Passed |
Ventricle_L | 0.951144 | 0.00689877 | 10 | 0.94620926 | 0.9 | Passed |
Ventricle_R | 0.980784 | 0.01685223 | 10 | 0.96872948 | 0.8 | Passed |
Wire_Breast_L (beta) | 0.750245 | 0.28619472 | 10 | 0.54552785 | 0.39 | Passed |
Wire_Breast_R (beta) | 0.896053 | 0.15011288 | 10 | 0.78867618 | 0.39 | Passed |
All listed structures met their respective acceptance criteria by having their Limbus DSC lower 95% confidence edge exceed or meet the specified Test DSC Thresholds.
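For reference, the Dice Similarity Coefficient used throughout the table measures volumetric overlap between an automatic contour and its ground-truth contour. A minimal pure-Python sketch on flattened binary masks (not the submission's implementation):

```python
def dice(mask_a, mask_b):
    """Dice Similarity Coefficient: 2|A ∩ B| / (|A| + |B|).

    1.0 means perfect overlap with the ground-truth contour; 0.0 means
    no overlap. Inputs are flattened binary masks of equal length.
    """
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    total = sum(map(bool, mask_a)) + sum(map(bool, mask_b))
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * inter / total

# Toy example: 3 shared voxels, 4 voxels in each mask -> 2*3/8 = 0.75
auto = [1, 1, 1, 1, 0]
manual = [0, 1, 1, 1, 1]
print(dice(auto, manual))  # 0.75
```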
2. Sample Sizes Used for the Test Set and Data Provenance
- Test Set Sample Size: For each structure, a set of at least 10 patient scans was used for initial performance testing. Some structures had larger test sets, as indicated in the table (e.g., Bladder with 21 scans, Bowel with 23 scans, SpinalCord with 28 scans).
- Data Provenance: The test scans were randomly selected from a total pool of patient scans that contained the relevant structure. This pool was selected to reflect the general population of patients receiving radiation treatments. The data provenance details are further described in the "Training and Validation Datasets" section for the training data, implying a similar origin for the test data (multiple clinical sites and countries).
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
- Number of Experts: The ground truth contours were from "multiple experts at multiple institutions." The exact number is not explicitly stated.
- Qualifications of Experts: The ground truth contours were all reviewed by a "board certified radiation oncologist" to ensure consistency with established standards and guidelines for contouring and proper labeling.
4. Adjudication Method for the Test Set
The document does not explicitly describe a specific adjudication method like "2+1" or "3+1" for the ground truth contours in the test set. It states that the ground truth contours were from "multiple experts at multiple institutions" and reviewed by "a board certified radiation oncologist." This implies a form of consensus or expert review process, though the specific protocol for resolving discrepancies (if any) is not detailed.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done
No, a multi-reader multi-case (MRMC) comparative effectiveness study was not explicitly mentioned as being performed for this submission. The performance data is focused on the standalone algorithm's accuracy (Dice Similarity Coefficient) against expert-generated ground truth.
6. If a Standalone (i.e., algorithm only without human-in-the-loop performance) Was Done
Yes, a standalone performance study was done. The "Automatic Contouring - Validation Test" as described is a benchtop performance test where the software's outputs are compared to expert-generated ground truth without human intervention in the contouring process of the device itself.
7. The Type of Ground Truth Used
The ground truth used was expert consensus contours. These were human-generated contours reviewed by a board-certified radiation oncologist to ensure they conformed to clinical trial guidelines and established standards.
8. The Sample Size for the Training Set
The total number of unique scans included in training datasets exceeds 10,000 scans. The table provided in the document details the number of training and validation scans for each individual structure model, with training scan counts ranging from tens to over a thousand for each structure.
9. How the Ground Truth for the Training Set Was Established
The ground truth for the training set was established through:
- Human-generated contours from a variety of anonymized and pseudo-anonymized datasets.
- These datasets were collected from publicly available clinical trials and the company's clinical and research partners.
- The training dataset ground truth contours were reviewed and edited by in-house clinicians and radiation oncologists to ensure consistency with established standards and guidelines for contouring (e.g., RTOG 1106, RTOG 0848, EMBRACE II, DAHANCA, NRG, ESTRO, ACROP, EPTN).
- To minimize bias, training data included scans from multiple clinical sites, countries (United States, Canada, United Kingdom, France, Germany, Italy, Netherlands, Switzerland, Australia, New Zealand, Singapore), and different makes/models of imaging devices (GE, Siemens, Philips, Toshiba, Elekta).
- The scans and ground truth contours were from the general patient population receiving radiotherapy, with no restrictions based on age, ethnicity, race, gender, or disease states.
(37 days)
Limbus AI Inc.
Limbus Contour is a software-only medical device intended for use by trained radiation oncologists, dosimetrists and physicists to derive optimal contours for input to radiation treatment planning. Supported image modalities are Computed Tomography and Magnetic Resonance. The Limbus Contour Software assists in the following scenarios:
· Operates in conjunction with radiation treatment planning systems to load, save, and display medical images and contours for treatment evaluation and treatment planning.
· Creation, transformation, and modification of contours for applications including, but not limited to: transferring contours to radiotherapy treatment planning systems, aiding adaptive therapy and archiving contours for patient follow-up.
· Localization and definition of healthy anatomical structures.
Limbus Contour is not intended for use with digital mammography.
Limbus Contour is a stand-alone software medical device. It is a single-purpose, cross-platform application for automatic contouring (segmentation) of CT/MRI DICOM images via pre-trained, expert-curated machine learning models. The software is intended to be used by trained medical professionals to derive contours for input to radiation treatment planning. The Limbus Contour software segments normal tissues using machine learning models and further post-processing of the model prediction outputs. Limbus Contour does not display or store DICOM images and relies on existing radiotherapy treatment planning systems (TPS) and DICOM image viewers for display and modification of generated segmentations. Limbus Contour interfaces with the user's operating system file system (importing DICOM image .dcm files and exporting segmented DICOM RT-Structure Set .dcm files).
The provided text describes the 510(k) submission for Limbus Contour, a software-only medical device for automatic contouring (segmentation) of CT/MRI DICOM images for radiation treatment planning. While the document outlines the device's indications for use, comparison to a predicate device, and general statements about software verification and validation, it does not provide specific acceptance criteria or detailed results of a study proving the device meets those criteria with numerical performance data.
The document states: "Validation testing of the following functions of the Limbus Contour application demonstrated that the software meets user needs and intended uses and to support substantial equivalence: Automatic Contouring Validation Test." However, it does not provide the specifics of this "Validation Test," such as pre-defined acceptance metrics (e.g., Dice Score thresholds, contour distance metrics), the performance achieved, or the methodology of the study.
Therefore, I cannot fulfill your request for:
- A table of acceptance criteria and the reported device performance: This information is not present in the provided document.
- Sample size used for the test set and the data provenance: Not specified.
- Number of experts used to establish the ground truth for the test set and the qualifications of those experts: Not specified beyond "expert curated machine learning models" in the device description.
- Adjudication method: Not specified.
- MRMC comparative effectiveness study: The document states "Clinical testing was not required to demonstrate the safety and effectiveness of Limbus Contour. Instead, substantial equivalence is based upon benchtop performance testing." This implies no MRMC study was performed.
- Standalone (algorithm only) performance: While the device is "software-only," specific numerical performance metrics for its standalone contouring accuracy are not provided.
- Type of ground truth used: Described as "pre-trained and expert curated machine learning models," but details on the ground truth for the specific validation test are missing.
- Sample size for the training set: Not specified.
- How the ground truth for the training set was established: Described as "pre-trained and expert curated machine learning models," but no specific methodology detail is given.
Conclusion based on the provided text:
The submission document only broadly states that "Validation testing" was performed and "demonstrated that the software meets user needs and intended uses." It lacks the detailed quantitative performance data and study methodologies typically found in a comprehensive clinical or performance study report. This kind of detail is usually provided in a separate report that would be referenced or included as an appendix in a full 510(k) submission, but it is not contained within this public summary letter.
(98 days)
Limbus AI Inc.
Limbus Contour is a software-only medical device intended for use by trained radiation oncologists, dosimetrists and physicists to derive optimal contours for input to radiation treatment planning. Supported image modalities are Computed Tomography and Magnetic Resonance. The Limbus Contour Software assists in the following scenarios:
· Operates in conjunction with radiation treatment planning systems or DICOM viewing systems to load, save, and display medical images and contours for treatment evaluation and treatment planning.
· Creation, transformation, and modification of contours for applications including, but not limited to: transferring contours to radiotherapy treatment planning systems, aiding adaptive therapy and archiving contours for patient follow-up.
· Localization and definition of healthy anatomical structures.
Limbus Contour is not intended for use with digital mammography.
Limbus Contour is not intended to automatically contour tumor clinical target volumes.
Limbus Contour is a stand-alone software medical device. It is a single-purpose, cross-platform application for automatic contouring (segmentation) of CT/MRI DICOM images via pre-trained, expert-curated machine learning models. The software is intended to be used by trained medical professionals to derive contours for input to radiation treatment planning. The Limbus Contour software segments normal tissues using machine learning models and further post-processing of the model prediction outputs. Limbus Contour does not display or store DICOM images and relies on existing radiotherapy treatment planning systems (TPS) and DICOM image viewers for display and modification of generated segmentations. Limbus Contour interfaces with the user's operating system file system (importing DICOM image .dcm files and exporting segmented DICOM RT-Structure Set .dcm files).
Here's a breakdown of the acceptance criteria and study information for the Limbus Contour device based on the provided FDA 510(k) summary:
Acceptance Criteria and Device Performance
The document does not explicitly state a table of "acceptance criteria" with numerical targets and reported performance in a pass/fail format. Instead, it refers to validation testing to demonstrate that the software meets "user needs and intended uses" and performs "in accordance with specifications." The implicit acceptance is that the automatic contouring function is accurate and comparable to the predicate device.
However, based on the general context of premarket submissions, the underlying acceptance criteria for automatic contouring typically involve metrics of overlap, distance, and shape similarity. Since specific numerical metrics and targets are not provided, the successful demonstration of functionality is treated below as the reported device performance.
Acceptance Criteria Category | Reported Device Performance |
---|---|
Automatic Contouring | Validation testing demonstrated that the software meets user needs and intended uses for automatic contouring and performs in accordance with specifications. Performance is comparable to the predicate device. |
Study Details
2. Sample Size for Test Set and Data Provenance
The document does not specify the exact sample size (number of cases or patients) used for the test set. It mentions "Validation testing of the following functions of the Limbus Contour application demonstrated that the software meets user needs and intended uses and to support substantial equivalence: Automatic Contouring – Validation Test."
The data provenance (economic area, retrospective/prospective) is also not specified in the provided text.
3. Number of Experts and Qualifications for Ground Truth
The document does not specify the number of experts or their qualifications used to establish the ground truth for the test set. It mentions "pre-trained and expert curated machine learning models" in the device description, implying expert involvement in the training data, but does not explicitly describe it for the test set.
4. Adjudication Method
The document does not specify any adjudication method (e.g., 2+1, 3+1) used for establishing the ground truth of the test set.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
No Multi-Reader Multi-Case (MRMC) comparative effectiveness study was mentioned, nor was any effect size of human readers improving with AI assistance reported. The study focused on the standalone performance of the device and its substantial equivalence to a predicate.
6. Standalone Performance Study
Yes, a standalone (algorithm only without human-in-the-loop performance) study was performed. The document explicitly states: "Validation testing of the following functions of the Limbus Contour application demonstrated that the software meets user needs and intended uses and to support substantial equivalence: Automatic Contouring – Validation Test." This is a validation of the autonomous contouring capability.
7. Type of Ground Truth Used
The document does not explicitly state the type of ground truth used (e.g., expert consensus, pathology, outcomes data) for the test set. The device description mentions "expert curated machine learning models," which suggests expert annotations were used at some stage, likely for training. However, the exact nature of the ground truth for the independent validation (test) set is not detailed.
8. Sample Size for Training Set
The document does not specify the sample size used for the training set.
9. How Ground Truth for Training Set was Established
The device description states that the Limbus Contour operates via "pre-trained and expert curated machine learning models." This indicates that the ground truth for the training set was established through expert curation, meaning medical experts (likely radiation oncologists, dosimetrists, or physicists) manually contoured structures which were then used to train the machine learning models.