PeekMed is a software system designed to help surgeon specialists carry out pre-operative planning for several surgical procedures in a prompt and efficient manner, based on their patients' imaging studies.
The software imports diagnostic imaging studies such as X-rays, Computed Tomography (CT), or Magnetic Resonance Imaging (MRI). The import process can retrieve files from a CD-ROM, a local folder, or the Picture Archiving and Communications System (PACS). In parallel, there is a database of digital representations of prosthetic materials supplied by their manufacturers.
PeekMed allows health professionals to perform the surgical planning digitally without adding any steps to that process. This software system requires no imaging study acquisition specification (no protocol). Experience in usage and a clinical assessment are necessary for a proper use of the software.
PeekMed is a standalone software application that helps specialist doctors and surgeons perform pre-surgical planning for different procedures in a fast and effective way, based on the patients' imaging studies.
PeekMed is a 3D pre-operative planning software for surgery. It allows surgeons to plan a surgical procedure by simulating various environments, from hybrid (2D/3D) to 3D-only or 2D-only.
The software imports diagnostic imaging studies such as X-rays, CT, or MRI. The import process can retrieve files from a local folder or the PACS of the hospital/health center. In parallel, there is a database containing digital representations of prosthetic materials supplied by their respective manufacturers. This offers the possibility of inserting templates of the materials to be used in the surgery, in addition to the measurements, providing a complete overview of the surgery.
In the case of a 3D or hybrid environment, the surgeon can use a 3D model generated from a previous imaging study of the patient and 3D digital representations of the prosthetic material to be used during surgery, e.g., screws, fixation plates, or full prostheses, from several manufacturers.
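To make the import step concrete, the following is a minimal, hypothetical sketch rather than PeekMed's implementation: it loads a study from a local folder using the open-source pydicom library and reads the PixelSpacing attribute on which calibrated measurements depend. The folder path, the assumption that files carry a .dcm extension, and the reliance on InstanceNumber for slice ordering are illustrative assumptions.

```python
# Minimal sketch (not PeekMed code): load a DICOM study from a local folder
# with pydicom and read the pixel spacing needed for calibrated measurements.
from pathlib import Path

import pydicom


def load_study(folder: str):
    """Read every .dcm file in a folder and sort the slices by InstanceNumber."""
    slices = []
    for path in Path(folder).glob("*.dcm"):  # assumes files use a .dcm extension
        slices.append(pydicom.dcmread(path))
    # InstanceNumber orders the slices within a series (assumed present here).
    slices.sort(key=lambda ds: int(ds.InstanceNumber))
    return slices


if __name__ == "__main__":
    study = load_study("path/to/study")  # hypothetical folder path
    first = study[0]
    row_mm, col_mm = (float(v) for v in first.PixelSpacing)
    print(f"{len(study)} slices, pixel spacing {row_mm} x {col_mm} mm")
```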
The PeekMed device is a software system designed for pre-operative planning in various surgical procedures, based on patient imaging studies.
Here is an analysis of the acceptance criteria and the studies that demonstrate the device meets them:
1. Table of Acceptance Criteria and Reported Device Performance
The provided document describes the validation activities and states that "Acceptance criteria were achieved for all tests." However, it does not explicitly list the quantitative acceptance criteria for each specific performance metric. Instead, it describes the type of validation performed and the goals of that validation.
Based on the information provided, we can infer the following:
| Acceptance Criteria (Implied) | Reported Device Performance |
|---|---|
| Software Functionality: All specified software functions operate as intended. | "Verification testing consisted of specific software functionalities testing and system level testing. Acceptance criteria, defined in the product requirements, were met for each verification test and are described in JIRA." This indicates that key functionalities, such as importing images, digital templating, measurement tools (ruler, angle), and 2D/3D planning environments, were tested and met their predefined criteria. |
| Measurement Accuracy and Repeatability: Lengths and angles measured by the software accurately and repeatably match real dimensions (see the sketch following this table). | "To reassure that the lengths and angles measured with the internal functions of 'ruler' and 'angle' of the PeekMed software effectively and repeatedly match the real dimensions, validation consists the measuring of images of three implantable medical devices (prosthesis), CE marked and with strictly defined dimensions. Validation phase ensures that all product requirements have been fulfilled, meets the end-users needs, and ensure the safety and proper performance of the device." |
| Fulfillment of Product Requirements: All defined product requirements are met. | "Validation phase ensures that all product requirements have been fulfilled..." |
| End-User Needs: The software meets the needs of end-users (surgeons). | "...meets the end-users needs..." and "Also, satisfaction questionnaires were made to assess the usability of the PeekMed software when comparing with others in the market, and also to make sure that the device operates as intended during the design stage." |
| Safety and Proper Performance: The device operates safely and performs properly. | "...and ensure the safety and proper performance of the device." |
| Usability: The software is usable for its intended purpose. | "Also, satisfaction questionnaires were made to assess the usability of the PeekMed software when comparing with others in the market..." |
| Effectiveness of 3D and Hybrid Planning (compared to 2D only): The new 3D and hybrid planning features do not raise new safety or effectiveness concerns. | The "Significant Differences" section of the "Comparison of Characteristics" table states: "In more to 2D, PeekMed can offer a 3D pre-surgical planning or a hybrid 2D/3D environment in addition to isolated 2D and 3D. PeekMed has been tested and validated for 3D and hybrid planning." And for the additional feature of allowing intersection of models: "The additional feature from PeekMed of allowing the intersection of the models has been tested and validated and does not raise different questions of safety or effectiveness." |
| Support for Diversified Orthopedic Subspecialties: The procedures not common to both PeekMed and the predicate have been tested and validated. | The "Significant Differences" table notes: "The procedures that are not common to both devices have been tested and validated through PeekMed development. It does not raise different questions of safety or effectiveness." |
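To illustrate what "ruler" and "angle" functions of this kind compute, the following sketch, which is not taken from the submission, applies standard geometry to a calibrated 2D image: pixel offsets are scaled by the pixel spacing to obtain lengths, and the angle between two segments is derived from their dot product. The point coordinates and spacing values are illustrative, and the angle is computed in pixel coordinates, which is exact only for isotropic spacing.

```python
# Minimal sketch (not PeekMed code) of what "ruler" and "angle" tools compute
# on a calibrated 2D image. All coordinates and spacings below are made up.
import math


def ruler_mm(p1, p2, spacing_mm):
    """Distance in mm between two (row, col) pixel points."""
    dr = (p2[0] - p1[0]) * spacing_mm[0]
    dc = (p2[1] - p1[1]) * spacing_mm[1]
    return math.hypot(dr, dc)


def angle_deg(a1, a2, b1, b2):
    """Angle in degrees between segments a1->a2 and b1->b2 (pixel coordinates,
    assuming isotropic pixel spacing)."""
    ax, ay = a2[0] - a1[0], a2[1] - a1[1]
    bx, by = b2[0] - b1[0], b2[1] - b1[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))


print(ruler_mm((100, 100), (100, 400), (0.2, 0.2)))  # 60.0 mm
print(angle_deg((0, 0), (1, 0), (0, 0), (1, 1)))     # 45.0 degrees
```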
2. Sample Size Used for the Test Set and Data Provenance
The document does not specify a distinct "test set" with a particular sample size in the context of a clinical study for performance evaluation. Instead, the validation involved:
- Internal validation: "measuring of images of three implantable medical devices (prosthesis), CE marked and with strictly defined dimensions." This sample size (n=3) is for validating measurement accuracy.
- External validation/Follow-up: "continuous follow-up from the Marketing and Sales team, follow-up on events registered in the platform mixpanel and user/customer surveys."
- Usability questionnaires: Sample size is not specified but implies a group of users.
Data Provenance: The document does not explicitly state the country of origin for the data used in validation. It mentions "images of three implantable medical devices (prosthesis), CE marked," which are likely standardized devices rather than patient data. The overall development and submission are from Portugal. The nature of the validation implies retrospective testing on the selected images and prospective feedback (user/customer surveys, event follow-up).
3. Number of Experts Used to Establish the Ground Truth for the Test Set and Their Qualifications
The document states that validation was "performed internally prior to the release to the market by qualified personnel (personnel with background in anatomy and biomedical field)". It does not specify a number of experts who established ground truth for the test set (the three implantable devices). For these devices, the "strictly defined dimensions" act as the inherent ground truth, meaning no external expert adjudication was needed to establish it.
For overall clinical assessment and user feedback, it mentions "surgeons' specialists" as the intended users and that "Experience in usage and a clinical assessment are necessary for a proper use of the software," implying that the feedback and assessment come from qualified medical professionals, but a specific number and detailed qualifications (e.g., years of experience) are not provided.
4. Adjudication Method for the Test Set
For the measurement validation using the three implantable devices, no adjudication method (like 2+1 or 3+1) is indicated because the "strictly defined dimensions" of the CE-marked prostheses served as the objective ground truth. The software's measurements were compared against these known dimensions.
For broader validation including user satisfaction and functionality, adjudication methods are not typically applicable in the same way as for diagnostic accuracy studies.
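As an illustration of how such a comparison against known dimensions could be expressed, the sketch below checks hypothetical measured values for the three prostheses against hypothetical nominal dimensions with an assumed 1 mm tolerance; none of these numbers, names, or thresholds come from the document.

```python
# Minimal sketch (not the submission's protocol): compare software-measured
# dimensions of three CE-marked prostheses against their nominal values and
# flag any deviation beyond a tolerance. All values below are illustrative.
NOMINAL_MM = {"prosthesis_A": 120.0, "prosthesis_B": 95.5, "prosthesis_C": 42.0}
MEASURED_MM = {"prosthesis_A": 119.6, "prosthesis_B": 95.8, "prosthesis_C": 42.1}
TOLERANCE_MM = 1.0  # assumed acceptance threshold, not stated in the document

for name, nominal in NOMINAL_MM.items():
    error = abs(MEASURED_MM[name] - nominal)
    status = "PASS" if error <= TOLERANCE_MM else "FAIL"
    print(f"{name}: nominal {nominal} mm, measured {MEASURED_MM[name]} mm, "
          f"error {error:.1f} mm -> {status}")
```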
5. Multi-Reader Multi-Case (MRMC) Comparative Effectiveness Study
No MRMC comparative effectiveness study is explicitly mentioned in the provided text. The submission focuses on demonstrating substantial equivalence to a predicate device (TraumaCad version 2.0) based on similar indications for use, technological characteristics, and performance testing, rather than a comparative effectiveness study showing improvement with AI assistance.
6. Standalone Performance (Algorithm Only without Human-in-the-Loop Performance)
The device is described as "a software system designed to help surgeon specialists carry out pre-operative planning." It "allows health professionals to digitally perform the surgical planning," and "human intervention for image interpretation" is explicitly listed as "Yes." This indicates that the device is an aid to a human professional and is not intended for standalone use without a human in the loop for interpretation and decision-making. Therefore, a standalone (algorithm-only) performance study, as typically understood for diagnostic AI, is not explicitly described or claimed. The validation focuses on the tools the software provides to the surgeon.
7. Type of Ground Truth Used
- For the core measurement accuracy validation: Known physical dimensions of CE-marked implantable medical devices (prostheses).
- For overall functionality and user experience: Implied expert consensus/feedback from qualified personnel with anatomy/biomedical background during internal validation, and feedback from surgeons via satisfaction questionnaires and follow-up.
- For comparing to predicate: The predicate device's established performance records.
8. Sample Size for the Training Set
The document does not describe the use of machine learning or AI models in a way that would require a distinct "training set" for an algorithm. It is presented as a software tool for pre-operative planning. Therefore, a training set size is not applicable or provided.
9. How the Ground Truth for the Training Set Was Established
Since no training set for an AI/ML algorithm is mentioned, this point is not applicable. The device is a "Picture Archiving And Communications System" with image processing capabilities facilitating human planning, rather than an autonomous diagnostic AI.
§ 892.2050 Medical image management and processing system.
(a)
Identification. A medical image management and processing system is a device that provides one or more capabilities relating to the review and digital processing of medical images for the purposes of interpretation by a trained practitioner of disease detection, diagnosis, or patient management. The software components may provide advanced or complex image processing functions for image manipulation, enhancement, or quantification that are intended for use in the interpretation and analysis of medical images. Advanced image manipulation functions may include image segmentation, multimodality image registration, or 3D visualization. Complex quantitative functions may include semi-automated measurements or time-series measurements.
(b)
Classification. Class II (special controls; voluntary standards—Digital Imaging and Communications in Medicine (DICOM) Std., Joint Photographic Experts Group (JPEG) Std., Society of Motion Picture and Television Engineers (SMPTE) Test Pattern).