510(k) Data Aggregation
(133 days)
The Arthrex VIP Web Portal is intended for use as a software interface for the transfer of imaging information from a medical scanner such as a CT scanner. It is also intended as software for displaying/editing implant placement and surgical treatment options that were generated in the OrthoVis Desktop Software by trained Arthrex technicians. The Arthrex VIP Web Portal is intended for use with the Arthrex Glenoid Intelligent Reusable Instrument System (Arthrex Glenoid IRIS) and with the Arthrex OrthoVis Preoperative Plan. It is indicated for use with the following glenoid implant lines: Arthrex Univers II and Arthrex Univers Apex, Keeled or Pegged Glenoid components, the Vault Lock Glenoid Component, as well as the Univers Revers and Modular Glenoid System (MGS) Baseplate components.
The ArthrexVIP Web Portal is composed of software intended to facilitate upload of medical images, preoperative planning, and plan approval of placement and orientation of total shoulder joint replacement components. Each surgeon user's uploaded images are associated with specific cases and with that surgeon's profile. Uploaded images can be downloaded from the portal by Arthrex technicians and used to create preoperative plans (see 510(k) K151568) in the OrthoVis Desktop Software. The surgeon user is then able to log in to the ArthrexVIP Web Portal to review the preoperative plan and either approve or modify the location and/or orientation of the joint replacement component. The approved plan is then downloaded by Arthrex technicians for production (see 510(k) K151500 and K151568) as part of the Arthrex Glenoid IRIS device.
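The upload/plan/review/production workflow described above can be sketched as a simple case state machine. This is a minimal illustrative model only; the class, state, and field names below are assumptions for exposition, not part of the actual portal software.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class CaseState(Enum):
    IMAGES_UPLOADED = auto()     # surgeon has uploaded CT images
    PLAN_CREATED = auto()        # technician generated a preoperative plan
    PLAN_APPROVED = auto()       # surgeon approved (possibly after edits)
    SENT_TO_PRODUCTION = auto()  # approved plan downloaded for IRIS production

@dataclass
class SurgicalCase:
    """One case, tied to a surgeon's profile (names are illustrative)."""
    surgeon: str
    images: list = field(default_factory=list)
    state: CaseState = CaseState.IMAGES_UPLOADED
    plan_modified: bool = False

    def create_plan(self):
        # Technician downloads the images and builds a plan in OrthoVis
        assert self.state is CaseState.IMAGES_UPLOADED
        self.state = CaseState.PLAN_CREATED

    def approve_plan(self, modified=False):
        # Surgeon reviews the plan, optionally modifying implant
        # location/orientation before approving it
        assert self.state is CaseState.PLAN_CREATED
        self.plan_modified = modified
        self.state = CaseState.PLAN_APPROVED

    def send_to_production(self):
        assert self.state is CaseState.PLAN_APPROVED
        self.state = CaseState.SENT_TO_PRODUCTION

case = SurgicalCase(surgeon="Dr. Example", images=["ct_scan.dcm"])
case.create_plan()
case.approve_plan(modified=True)
case.send_to_production()
```

The point of the sketch is that the portal is a coordination layer: every transition is performed by a human (surgeon or technician), which is why the document treats the device as a data-management tool rather than a diagnostic algorithm.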
The ArthrexVIP Web Portal is a software device intended for managing imaging information and displaying/editing implant placement and surgical treatment options for shoulder joint replacement.
Here's an analysis based on the provided text:
1. Table of Acceptance Criteria and Reported Device Performance
The provided document does not explicitly state specific quantitative acceptance criteria or corresponding reported device performance metrics for the ArthrexVIP Web Portal in a typical medical device performance study context (e.g., sensitivity, specificity, accuracy against a gold standard).
Instead, the "acceptance criteria" are implied by the non-clinical testing performed to demonstrate substantial equivalence to a predicate device. The performance is assessed through various software testing methodologies rather than clinical outcomes or diagnostic accuracy.
Implied Acceptance Criteria (based on testing performed):
| Acceptance Criteria Category | Description (as implied by testing) | Reported Device Performance |
| --- | --- | --- |
| Software Functionality | The software must function as intended for uploading, transferring, displaying, and editing imaging information and surgical plans, maintaining data integrity, and allowing for plan approval/modification. Changes from the predicate must be correctly integrated. | Successfully passed Software Verification and Validation. |
| Software Quality | The software code must be robust, free of critical defects, and correctly integrated. | Successfully passed Regression Testing, Unit Testing, Code Reviews and Checks, and Integration Testing. |
| Dimensional Accuracy | The display/editing of implant placement and surgical treatment options, including new implant models and features, must be dimensionally accurate and consistent with the intended design. | Successfully passed Dimensional Verification. |
| Intended Use Fulfillment | The device fulfills its intended use as a software interface for the transfer of imaging data and for displaying/editing plans generated by trained Arthrex technicians, for the specified glenoid implant lines, including the added Modular Glenoid System baseplate. | Demonstrated through software testing that the device functions as described for its intended use and expanded implant lines. |
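To make concrete what "the ground truth is the requirements specification" means for this kind of verification testing, a single test case simply compares an observed software output against a predefined expected outcome. The requirement ID, function, and values below are invented for illustration; the actual Arthrex test protocols are not described in the document.

```python
def verify(requirement_id, actual, expected, tol=0.0):
    """Compare an observed software output against the expected
    outcome defined in the requirements specification."""
    passed = abs(actual - expected) <= tol
    return {"req": requirement_id, "passed": passed}

# Invented example requirement: the implant version angle displayed by
# the portal must equal the value stored in the approved plan record.
result = verify("REQ-042", actual=-5.0, expected=-5.0)
```

A full verification run is just many such checks, each traceable to one documented requirement, with any failure logged as a defect rather than adjudicated.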
2. Sample Size Used for the Test Set and Data Provenance
The document does not specify a sample size for a test set in the traditional sense of a clinical or imaging dataset. The testing performed is primarily software verification and validation, regression testing, unit testing, code reviews, integration testing, and dimensional verification.
- Test Set: Not applicable in the context of clinical images/data. The "test set" would refer to the various software modules, functions, and simulated data used during the V&V processes.
- Data Provenance: Not applicable. No patient data provenance is mentioned as this is not a device that directly analyzes patient data for diagnostic or treatment recommendations. The "data" being tested are software inputs, outputs, and internal states.
3. Number of Experts Used to Establish Ground Truth for the Test Set and Qualifications
This information is not provided and is generally not applicable for this type of software-only device where "ground truth" relates to functional correctness and dimensional accuracy rather than clinical interpretation of medical images. For dimensional verification, engineers or subject matter experts in CAD/imaging might define the "ground truth" for measurements, but the document does not elaborate.
4. Adjudication Method for the Test Set
This information is not provided and is generally not applicable. Software testing typically involves predefined test cases and expected outcomes. Discrepancies would be logged as bugs and resolved, rather than adjudicated by multiple experts in a consensus-based manner.
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
No MRMC comparative effectiveness study was done. The document explicitly states: "Clinical testing was not necessary to determine substantial equivalence to the predicate." This device is a software portal for image transfer and display/editing of pre-generated plans, not an AI diagnostic tool or an imaging modality that requires human reader performance evaluation.
6. Standalone Performance Study (Algorithm Only Without Human-in-the-Loop Performance)
Not applicable in the context of typical AI/CADe/CADx devices. This device is a portal and editing tool; it doesn't have an "algorithm" in the sense of making independent diagnostic or treatment recommendations that would require standalone performance metrics like sensitivity or specificity. Its standalone performance is assessed through its software verification and validation processes (i.e., does it perform its specific functions correctly, reliably, and accurately as a piece of software).
7. Type of Ground Truth Used
For the non-clinical testing:
- Software Verification and Validation, Regression Testing, Unit Testing, Code Reviews, Integration Testing: The ground truth is the software requirements specification and design documentation. The software must perform according to these documented specifications.
- Dimensional Verification: The ground truth would be based on engineering specifications, CAD models, or established measurement standards for the implant models and visualization tools.
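A dimensional verification check of the kind described above can be sketched as a tolerance comparison against a CAD-derived nominal value. The nominal dimension, tolerance, and measurements here are illustrative assumptions, not values from the submission.

```python
def within_tolerance(measured_mm, nominal_mm, tol_mm):
    """Dimensional verification: does a measured dimension fall
    within tolerance of the CAD/engineering nominal value?"""
    return abs(measured_mm - nominal_mm) <= tol_mm

# Invented example: rendered glenoid peg diameter vs. a hypothetical
# CAD nominal of 6.5 mm with a 0.05 mm tolerance
nominal = 6.5
measurements = [6.49, 6.51, 6.50]
ok = all(within_tolerance(m, nominal, tol_mm=0.05) for m in measurements)
```

Here the CAD model plays the role of ground truth: the check passes only if every measurement of the displayed geometry falls inside the stated tolerance band.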
8. Sample Size for the Training Set
Not applicable. The ArthrexVIP Web Portal is a software tool for data transfer, display, and editing of pre-generated plans. It does not utilize machine learning or AI that requires a "training set" of data.
9. How the Ground Truth for the Training Set Was Established
Not applicable, as there is no training set.
(89 days)
The Arthrex VIP Web Portal is intended for use as a software interface for the transfer of imaging information from a medical scanner such as a CT scanner. It is also intended as software for displaying/editing implant placement and surgical treatment options that were generated in the OrthoVis Desktop Software by trained COS technicians. The Arthrex VIP Web Portal is intended for use with the Arthrex Glenoid Intelligent Reusable Instrument System (Arthrex Glenoid IRIS) and with the Arthrex OrthoVis Preoperative Plan. It is indicated for use with the following glenoid implant lines: Arthrex Univers Apex, Arthrex Univers II, and Arthrex Univers Revers.
The ArthrexVIP Web Portal is composed of software intended to facilitate upload of medical images, preoperative planning, and plan approval of placement and orientation of total shoulder joint replacement components. Each surgeon user's uploaded images are associated with specific cases and with that surgeon's profile. Uploaded images can be downloaded from the portal by COS technicians and used to create preoperative plans (see 510(k) K151568) in the OrthoVis Desktop Software. The surgeon user is then able to log in to the ArthrexVIP Web Portal to review the plan and either approve or modify the location and/or orientation of the joint replacement component. The approved plan is then downloaded by COS technicians for production (see 510(k) K151500 and K151568) as part of the Arthrex Glenoid IRIS device.
The provided document is a 510(k) summary for the ArthrexVIP Web Portal, which is a software device intended for use in preoperative planning for shoulder joint replacement. This document primarily focuses on demonstrating substantial equivalence to a predicate device and does not contain detailed information about a study proving the device meets specific acceptance criteria in the format requested.
Here's an attempt to answer the questions based on the limited information available in the document, along with an explanation of why certain information cannot be provided:
1. A table of acceptance criteria and the reported device performance
The document does not provide a table of acceptance criteria or specific reported device performance metrics tied to such criteria. The submission focuses on demonstrating "substantial equivalence" to a predicate device through non-clinical testing, rather than establishing performance against defined criteria.
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
The document does not specify a test set sample size or data provenance for any clinical performance evaluation. The non-clinical testing performed includes "Software verification and validation," "Regression Testing," "Unit Testing," "Code reviews and checks," and "Integration Testing." These are software development and quality assurance activities, not studies involving human subjects or medical image data in a clinical context.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
No information is provided regarding experts, ground truth establishment, or clinical test sets.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
No information is provided regarding adjudication methods, as no clinical test set is described.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance
No MRMC study was mentioned or performed. The device is a "software interface" for displaying/editing implant placement and surgical treatment options, not an AI-assisted diagnostic tool that would typically involve improving human reader performance.
6. If a standalone performance study (i.e. algorithm only, without human-in-the-loop performance) was done
No information is provided about a standalone algorithm performance study. The device is described as a "software interface" and a tool for displaying/editing options generated by "trained COS technicians" and reviewed by "surgeon users," indicating a human-in-the-loop design.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
The document does not describe the establishment of a "ground truth" in a clinical sense for performance evaluation. The testing performed ("Software verification and validation," etc.) would involve internal quality metrics and ensuring the software functions as designed, rather than comparison to a clinical ground truth.
8. The sample size for the training set
No training set is mentioned. This device is not described as an AI/ML algorithm that learns from data in a training set. It is a software interface and planning tool.
9. How the ground truth for the training set was established
Not applicable, as no training set for an AI/ML algorithm is described.
Summary of what can be extracted from the document regarding acceptance criteria and studies:
The document states:
- Non-Clinical Testing: "The following testing was performed to demonstrate substantial equivalency of the ArthrexVIP Web Portal to the OrthoVis Web Portal: Software verification and validation, Regression Testing, Unit Testing, Code reviews and checks, Integration Testing, Dimensional Validation (performed on predicate device and code has not changed for the subject device)."
- Clinical Testing: "Clinical testing was not necessary to determine substantial equivalence to the predicate."
This 510(k) submission relies on non-clinical software validation and verification activities to establish substantial equivalence to a predicate device, rather than explicit clinical performance criteria with associated studies involving patient data or experts. Therefore, most of the requested information regarding acceptance criteria, sample sizes, ground truth, and human reader performance is not present in this document.
(106 days)
The Glenoid Intelligent Reusable Instrument System ("Glenoid IRIS") is a patient specific manual instrument system intended to facilitate preoperative planning and intraoperative placement of the central glenoid guide pin used in the preparation of the glenoid in total shoulder systems that utilize a central guide pin for preparing the glenoid to receive the glenoid implant.
The Arthrex Glenoid IRIS is indicated for use with the Arthrex Univers Apex, Keeled or Pegged Glenoid components as well as the Univers Revers Baseplate component.
The indications for use of the Arthrex shoulder systems with which the Arthrex Glenoid IRIS is intended to be used are the same as those described in the labeling for these shoulder systems.
The SmartBase is a reusable instrument that allows the IRI device from the Arthrex Glenoid IRIS system (K151500) to be set according to prescribed lengths and heights for a specific patient's glenoid anatomy. The IRI leg lengths and their respective heights are determined in the OrthoVis software which is a part of the Arthrex Glenoid IRIS system. Along with the IRI leg lengths for each IRI slot and their respective heights, images of where the IRI was planned to sit on the patient's glenoid for the prescribed leg lengths and heights are given.
Loading the IRI device according to the prescribed lengths, setting the height of each IRI leg according to the prescribed SmartBase ruler heights, and then placing the IRI on the glenoid according to the preoperative plan images allows the IRI to transfer the guide pin for the surgeon-approved, preoperatively planned glenoid implant trajectory to the patient in the OR. It provides the same function as the SmartBone (setting the IRI device), which was previously cleared in the predicate, K151500.
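The prescribed per-slot leg lengths and ruler heights described above could be modeled as a simple plan record with a consistency check. The field names, slot labels, and values below are illustrative assumptions, not Arthrex's actual data format.

```python
from dataclasses import dataclass

@dataclass
class IRIPlan:
    """Prescribed SmartBase settings for one patient: a leg length and
    a ruler height for each IRI slot (slot names and values invented)."""
    leg_lengths_mm: dict  # slot name -> prescribed leg length (mm)
    leg_heights_mm: dict  # slot name -> prescribed ruler height (mm)

    def validate(self):
        # Every slot with a prescribed length must also have a height,
        # and all values must be positive.
        if set(self.leg_lengths_mm) != set(self.leg_heights_mm):
            raise ValueError("slots for lengths and heights must match")
        values = list(self.leg_lengths_mm.values()) + list(self.leg_heights_mm.values())
        if any(v <= 0 for v in values):
            raise ValueError("lengths and heights must be positive")
        return True

plan = IRIPlan(
    leg_lengths_mm={"A": 18.0, "B": 22.5, "C": 20.0},
    leg_heights_mm={"A": 4.0, "B": 6.5, "C": 5.0},
)
```

The sketch makes the division of labor explicit: OrthoVis produces the numbers, and the SmartBase is the physical jig that lets the OR staff reproduce them on the IRI device.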
The document describes the SmartBase for Arthrex Glenoid IRIS, a reusable instrument intended to facilitate the intraoperative placement of the central glenoid guide pin during total shoulder replacement surgery. The device is a "patient-specific manual instrument system" designed to interface with the Arthrex Glenoid IRIS system.
Here's an analysis of the provided information regarding acceptance criteria and the study:
1. Table of Acceptance Criteria and Reported Device Performance
| Performance Metric | Acceptance Criteria (Implied) | Reported Device Performance |
| --- | --- | --- |
| Accuracy of Glenoid Guide Pin Placement | No significant difference compared to the predicate device (SmartBone). | "Results of the SmartBase vs. SmartBone study found that there is no significant difference between using the SmartBase or the SmartBone-Pin Trajectory instrument in placing the glenoid guide pin." |
| Accuracy of IRI Leg Height Achievement | No significant error in achieving planned IRI leg heights. | "Results also found that there was no significant error in achieving the planned IRI leg heights with the SmartBase device." |
| Biocompatibility | Conformance with ISO 16061 and FDA Guidance for common biocompatible materials. | The SmartBase device's indirectly patient-contacting parts (316 stainless steel) are composed of a commonly used biocompatible alloy, conforming to ISO 16061 and FDA guidance, thus no further biocompatibility testing was required. |
| Sterilization/Cleaning Efficacy | Validated for cleaning and sterilization in the modified Arthrex Glenoid IRIS tray. | "Cleaning and Sterilization Validation was performed on the SmartBase device in a fully loaded Arthrex Glenoid IRIS Instrument tray." |
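A "no significant difference" conclusion of the kind reported for pin placement typically rests on a two-sample comparison of placement errors between the two instruments. Below is a minimal sketch of such a comparison using a hand-computed Welch's t statistic with invented error data; the actual study's sample sizes, measurements, and statistical method are not described in the document.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic for comparing the mean
    placement error of two instruments (unequal variances)."""
    ma, mb = mean(sample_a), mean(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (ma - mb) / ((va / na + vb / nb) ** 0.5)

# Invented guide-pin angular errors (degrees), for illustration only
smartbase_err = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8]
smartbone_err = [1.0, 1.2, 1.1, 0.9, 1.3, 1.0]

t = welch_t(smartbase_err, smartbone_err)
```

With a |t| well below the critical value for these sample sizes, the illustrative data would be consistent with a "no significant difference" finding, i.e. the new instrument matching the predicate's accuracy.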
2. Sample Size Used for the Test Set and Data Provenance
The document explicitly mentions a "SmartBase vs. SmartBone Comparison Study." However, it does not specify the sample size used for this test set (e.g., number of cases, number of measurements).
The document also does not specify the data provenance (e.g., country of origin, retrospective or prospective) for this comparison study.
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Qualifications of Those Experts
The document does not provide information on the number of experts used or their qualifications to establish ground truth for the comparison study. Given the nature of comparing instrument performance in placing a guide pin, the "ground truth" likely referred to the preoperatively planned pin trajectory or a direct measurement of the achieved pin trajectory relative to the plan, rather than a diagnostic interpretation by experts.
4. Adjudication Method for the Test Set
The document does not specify an adjudication method.
5. If a Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study Was Done, and Effect Size of AI Improvement
No, an MRMC comparative effectiveness study was not done. This device is a manual surgical instrument, not an AI or imaging diagnostic device. The study focused on comparing the performance of two physical instruments (SmartBase vs. SmartBone).
6. If a Standalone (Algorithm Only Without Human-in-the-Loop Performance) Was Done
No, a standalone (algorithm-only) performance study was not done. The SmartBase is a manual instrument used by a surgeon, and its performance is inherently tied to human interaction.
7. The Type of Ground Truth Used
The "ground truth" in the "SmartBase vs. SmartBone Comparison Study" appears to be defined by the preoperatively planned glenoid guide pin trajectory and the planned IRI leg heights. The comparison aimed to determine if the SmartBase could achieve these planned parameters as effectively as the predicate SmartBone. This is essentially a metrological or accuracy-based ground truth related to surgical planning and execution.
8. The Sample Size for the Training Set
The document does not mention a training set. This is because the device is a mechanical instrument, not an AI algorithm that requires training.
9. How the Ground Truth for the Training Set Was Established
Not applicable, as there is no training set for this device.