The Sonicision™ cordless ultrasonic dissection device is indicated for soft tissue incisions when bleeding control and minimal thermal injury are desired. The device can be used as an adjunct to or substitute for electrosurgery, lasers, and steel scalpels in general, plastic, gynecologic, and urologic procedures, exposure to orthopedic structures (such as spine and joint space), and other open and endoscopic procedures. The Sonicision™ cordless ultrasonic dissection device can be used to coagulate isolated vessels up to 5 mm in diameter.
The Sonicision™ 13 cm device is also indicated for use in otorhinolaryngologic (ENT) procedures.
The Sonicision™ Cordless Ultrasonic Dissector is a component of the Sonicision™ Cordless Ultrasonic Dissection Device, a hand-held surgical device consisting of three interdependent components that, when assembled, enable ultra-high-frequency mechanical motion (ultrasonic energy) to transect, dissect, and coagulate tissue.
The dissector is a sterile, single-use component to which the reusable Sonicision™ generators and battery packs attach. This component contains features essential to the control and performance of the assembled device, such as the clamping jaw, active blade, speaker, two-stage energy button, rotation wheel, and jaw lever.
Four configurations are available, differing principally in shaft length: 13 cm, 26 cm, 39 cm, and 48 cm, corresponding to catalog numbers SCD13, SCD26, SCD396, and SCD48, respectively. The only difference between SCD391 and SCD396 is the packaging configuration; there is no difference in the design of the device.
The provided text describes a 510(k) premarket notification for the Sonicision™ Cordless Ultrasonic Dissector, a surgical device. While it details performance testing, it does not explicitly state specific numerical acceptance criteria that the device had to meet. Instead, the performance studies aim to demonstrate comparability or non-inferiority to the predicate device.
Therefore, I cannot populate a table with explicit acceptance criteria and corresponding numerical performance values, as they are not presented in that format in the document. The document describes the types of performance tests conducted and the finding of comparability.
However, I can extract and present the information related to the studies in the requested format, interpreting "acceptance criteria" as the goal of the test (e.g., comparable burst strength) and "reported device performance" as the outcome in relation to the predicate.
Here's the breakdown of the information based on the provided text:
Acceptance Criteria and Device Performance
As explicit numerical acceptance criteria are not provided, the "acceptance criteria" are inferred as demonstrating comparability or non-inferiority to the predicate device for various performance aspects.
| Acceptance Criteria (Inferred Goal) | Reported Device Performance (Study Outcome) |
|---|---|
| Ex-vivo burst strength of coagulated blood vessels comparable to the predicate. | Blood vessels coagulated by the Sonicision™ Cordless Ultrasonic Dissector had burst strength comparable to that of the same type of blood vessels coagulated by the predicate. |
| Maximum temperatures and cool-down times of the active blade and shaft comparable to the predicate after multiple activations on mesentery. | The device's maximum temperatures and cool-down times for the active blade and shaft were comparable to those of the predicate after multiple activations on mesentery. |
| Rates of hemostasis and lateral thermal spread comparable to the predicate in acute in-vivo testing. | The Sonicision™ Cordless Ultrasonic Dissector and the predicate achieved comparable rates of hemostasis and comparable lateral thermal spread in acute in-vivo testing. |
| Vessel hemostasis retention (vessels up to 5 mm) for at least 21 days after coagulation. | Vessels up to 5 mm in diameter coagulated by the Sonicision™ Cordless Ultrasonic Dissector maintained hemostasis for at least 21 days, as demonstrated in chronic in-vivo testing. |
| Meets user needs and FDA expectations (human factors validation). | Human factors validation in a porcine model and a human cadaver model demonstrated that the Sonicision™ Cordless Ultrasonic Dissector meets user needs and FDA expectations, implying successful completion of simulated-use tasks without critical errors. |
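The submission reports these outcomes qualitatively rather than with underlying statistics. Purely as an illustrative sketch, and not drawn from the submission, the Python snippet below shows the general shape of a comparability analysis on ex-vivo burst pressures between a subject and a predicate device; the data values, test choice, and function name are all hypothetical.

```python
# Illustrative only: hypothetical burst-pressure data (mmHg) for vessels sealed by a
# subject device and a predicate device. All values are invented for this sketch.
from math import sqrt
from statistics import mean, stdev

subject_mmHg = [920, 870, 1010, 880, 950, 990, 860, 930]
predicate_mmHg = [900, 940, 860, 970, 910, 880, 1000, 890]

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

t_stat = welch_t(subject_mmHg, predicate_mmHg)
print(f"subject mean = {mean(subject_mmHg):.0f} mmHg, "
      f"predicate mean = {mean(predicate_mmHg):.0f} mmHg, Welch t = {t_stat:.2f}")

# A real comparability/non-inferiority analysis would pre-specify a margin and report a
# confidence interval for the mean difference; this only shows the shape of the comparison.
```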
Study Details
- Sample sizes used for the test set and the data provenance:
  - Ex-vivo Burst Testing: Specific sample size not provided, but performed on "blood vessels."
  - Ex-vivo Tissue Testing: Specific sample size not provided, but performed on "mesentery."
  - Acute In-vivo Testing: Specific sample size not provided.
  - Chronic In-vivo Testing: Specific sample size not provided.
  - Human Factors Validation: Performed in a "porcine model and a human cadaver model." Specific number of participants/cadavers not provided.
  - Data Provenance: The studies are described as "bench and animal tests" and human cadaver models, implying controlled laboratory or clinical simulation environments. The country of origin is not specified but is presumably where Covidien (Boulder, Colorado) conducts its R&D or contracted research. The studies are prospective in nature, designed specifically to test the device.
- Number of experts used to establish the ground truth for the test set and the qualifications of those experts:
  - The document does not specify the number or qualifications of experts involved in establishing ground truth for any of these performance tests. For animal and cadaver studies, typically veterinarians, pathologists, and surgeons/anatomists would be involved, but this information is not provided.
- Adjudication method for the test set:
  - The document does not describe any specific adjudication method (e.g., 2+1, 3+1) for the outcomes of these performance tests. The results seem to be presented as direct measurements or observations.
- Whether a multi-reader multi-case (MRMC) comparative effectiveness study was done and, if so, the effect size of how much human readers improve with AI assistance versus without:
  - No. An MRMC comparative effectiveness study involving human "readers" or AI assistance was not done. This device is a surgical instrument, not an imaging diagnostic device that would typically involve human "readers" or AI for image interpretation. The human factors validation concerns user interaction with the device, not diagnostic interpretation.
- Whether a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done:
  - Not applicable. The device is a physical surgical instrument and does not involve an AI algorithm with standalone performance; its performance is always integrated with human use.
- The type of ground truth used (expert consensus, pathology, outcomes data, etc.):
  - Burst Strength: Likely measured mechanically (physics-based ground truth).
  - Temperatures/Cool-Down Times: Measured using temperature sensors (physics-based ground truth; see the sketch after this list).
  - Hemostasis/Lateral Thermal Spread: Likely assessed visually, histologically (pathology-based), or through direct measurement during animal studies (outcomes/observational ground truth).
  - Hemostasis Retention: Assessed through long-term observation in chronic animal studies (outcomes data).
  - Human Factors Validation: Established by objective task-completion metrics (e.g., successful dissection, coagulation, no user errors) and subjective user feedback, compared against pre-defined user needs and FDA expectations for usability.
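For the temperature and cool-down endpoints, the ground truth is a direct physical measurement from a logged temperature trace. As a hedged sketch only, not taken from the submission, the Python snippet below shows one way a cool-down time could be derived from such a trace; the 60 °C threshold, 1 Hz sampling rate, and data values are assumptions.

```python
# Illustrative only: derive a cool-down time from a hypothetical blade-temperature log.
# Assumptions: 1 Hz sampling; cool-down measured from the end of activation until the
# temperature first falls below a chosen threshold (60 °C here, purely illustrative).

def cool_down_time_s(temps_c, activation_end_idx, threshold_c=60.0, sample_hz=1.0):
    """Seconds from end of activation until temperature first drops below threshold_c."""
    for i in range(activation_end_idx, len(temps_c)):
        if temps_c[i] < threshold_c:
            return (i - activation_end_idx) / sample_hz
    return None  # never cooled below the threshold within the logged window

# Hypothetical log: heating during activation (indices 0-4), then passive cooling.
log_c = [25, 80, 140, 180, 195, 170, 150, 120, 95, 75, 58, 45, 38]
print(cool_down_time_s(log_c, activation_end_idx=4))  # -> 6.0 seconds
```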
- The sample size for the training set:
  - This concept (training set) is not applicable, as the device is a physical surgical instrument, not a machine learning/AI model. Therefore, there is no "training set" in the conventional sense. The development of the device would involve engineering, design, and iterative testing, but not data-driven "training" like an AI.
- How the ground truth for the training set was established:
  - As there is no "training set" for an AI model, this question is not applicable to this device submission.