
510(k) Data Aggregation

    K Number: K241913
    Date Cleared: 2024-08-28 (58-day review)
    Product Code: (not stated)
    Regulation Number: 876.1500
    Reference Devices: K202571, K232773
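The 58-day figure in the record is the interval between the 510(k) receipt date and the decision date. Assuming that interpretation, the implied receipt date can be back-calculated from the 2024-08-28 clearance with a few lines of Python (the receipt date below is derived, not stated in the record):

```python
from datetime import date, timedelta

decision = date(2024, 8, 28)   # Date Cleared, as shown in the record
review_days = 58               # review time shown in the record

# Implied receipt date, assuming review_days = decision - receipt
received = decision - timedelta(days=review_days)
print(received)                      # 2024-07-01
print((decision - received).days)    # 58
```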

    Intended Use

    The da Vinci Handheld Camera is intended for endoscopic viewing of internal surgery sites during minimally invasive surgery. It is designed for use with compatible da Vinci Surgical Systems.

    Device Description

    The da Vinci Handheld Camera is a lightweight handheld 2D camera which can be connected to any third party 5 mm to 10 mm laparoscope to view images on the compatible da Vinci System Vision cart. The da Vinci Handheld Camera consists of the camera head, the light guide, and the light guide adapter. The da Vinci Handheld Camera Light Guide is a detachable device that connects to the third party laparoscope via a Handheld Camera Light Guide Adapter, and to the camera connector prior to connecting it to the endoscope controller of the da Vinci Surgical System. The camera is reusable and is not provided sterile to the user.

    The da Vinci Handheld Camera connects to a compatible da Vinci Surgical System Vision Cart. The da Vinci Handheld Camera, Light Guide, and Light Guide Adapter are designed for first entry and laparoscopic tasks during robotic procedures, prior to docking the compatible da Vinci Surgical System patient cart. It is intended to be used by surgeons, circulating nurses (non-sterile users), and scrub nurses (sterile users) in a hospital operating room (OR).

    AI/ML Overview

    The provided text is a 510(k) Premarket Notification from the FDA for a medical device called the "da Vinci Handheld Camera." It details the regulatory process, device description, and a summary of non-clinical and clinical tests performed to demonstrate substantial equivalence to a predicate device.

    However, the document states that there are "no changes to the subject device design, material, and fundamental technology" compared to the predicate device. The changes are solely related to "labeling and da Vinci SP1098 System software to enable compatibility."

    Therefore, the document does not describe a study that proves the device meets specific performance acceptance criteria through direct testing of the device's fundamental function, as the performance is assumed to be identical to the predicate. Instead, the testing described focuses on:

    • Animal Validation: Primarily to evaluate the safety and efficacy of the Handheld Camera when used with another specific accessory (SP Access Port Kit) in a simulated clinical setting, focusing on vision parameters. This is not a direct test of the Handheld Camera's primary functional performance against quantitative acceptance criteria for image quality, resolution, etc., but rather a system-level evaluation.
    • Bench Verification: Explicitly states that "Therefore, design and Reliability verification was not performed, and test results provided in the predicate submission K191043 remains valid and applies to the subject device." This means no new bench testing against performance acceptance criteria was performed for the device itself.
    • Cybersecurity Verification: To ensure compatibility with new software and that no new cybersecurity risks were introduced.
    • Software User Interface Verification: To confirm compatibility with the new da Vinci SP Surgical System software.
    • Human Factors Evaluation: To analyze use-related risks and ensure safe and effective interaction with the device in its updated context.

    Given this information, it is not possible to extract the specific acceptance criteria and the detailed study proving the device meets them from the provided text, as the core performance of the device's camera functionality was not re-evaluated. The document relies on the substantial equivalence to the predicate device (K191043) for those aspects.

    Therefore, many of the requested fields cannot be directly answered from the provided text because the "study that proves the device meets the acceptance criteria" for its core function (e.g., image quality, resolution) was not performed on this specific submission, but rather referenced from the predicate device's clearance.


    Based on the provided text, here's what can be extracted, and where information is explicitly stated as not applicable or not re-evaluated for this specific submission:

    1. A table of acceptance criteria and the reported device performance

    Acceptance Criteria → Reported Device Performance

    • Specific quantitative performance criteria (image quality, resolution, illumination, etc.) for the Handheld Camera (the "device"): Not provided in this document. The document states: "The design input, mechanical components, and device features are identical between the subject and predicate Handheld Camera K191043. Therefore, design and Reliability verification was not performed, and test results provided in the predicate submission K191043 remains valid and applies to the subject device." The animal validation focused on "vision parameters of the subject device when used with the SP Access Port Kit" but did not provide quantitative performance metrics.
    • Cybersecurity requirements: "Testing confirmed the subject device meets Cybersecurity requirements and identified no issues of safety or effectiveness and no new risks."
    • Software user interface compatibility: "Testing confirmed that the test article met design inputs as documented in the user interface specification of the da Vinci SP Surgical System."
    • Safety and effectiveness (human factors): "Results from the performance data indicate that the subject da Vinci Handheld Camera is substantially equivalent to the predicate device (K191043)." "no issues of safety or effectiveness and no additional unexpected risks were identified."
    • Design outputs fulfilled (animal validation): "Design validation activities demonstrated that the design outputs fulfill the user needs and that the intended use have been met." (This is a qualitative statement based on "vision assessments" in porcine models, not specific quantitative criteria for the device itself.)

    2. Sample size used for the test set and the data provenance

    • Animal Validation: "In-vivo testing with a live porcine model". The exact sample size (number of animals) is not specified. Data provenance is simulated clinical setting using live porcine models.
    • Bench Verification: Not applicable for new testing of device performance.
    • Cybersecurity Verification: Not specified (likely a system-level test on the SP1098 System with the camera).
    • Software User Interface Verification: Not specified (likely system-level testing on the SP1098 System with the camera).
    • Human Factors: Analysis of "post-market data and the MAUDE database" (retrospective, likely US data) and formative usability evaluations (details on sample size or specific study type for these are not provided).

    3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts

    • Not applicable for the core device performance evaluation as this was not re-evaluated.
    • For the Animal Validation (safety/efficacy with an accessory): The evaluators/observers are referred to as performing "vision assessments," but their number and qualifications are not specified.
    • For Human Factors: "Formative usability evaluations" were conducted, but details on evaluators, their number, or qualifications are not provided.

    4. Adjudication method (e.g. 2+1, 3+1, none) for the test set

    • Not applicable and not described for any of the testing mentioned, as these were primarily engineering and system compatibility verification activities, not reader studies requiring adjudication.

    5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of how much human readers improve with AI vs. without AI assistance

    • No MRMC study was performed or described. The device is a "Handheld Camera" and not an AI-powered diagnostic tool, so this type of study would not be relevant.

    6. If a standalone (i.e., algorithm-only, without human-in-the-loop) performance study was done

    • Not applicable. The device is a physical camera for endoscopic viewing, not a standalone algorithm.

    7. The type of ground truth used (expert consensus, pathology, outcomes data, etc)

    • For Animal Validation: The "ground truth" was likely the direct observation by the evaluators of the "vision parameters" in the live porcine model. This is an in-vivo assessment, not based on pathology or expert consensus.
    • For Cybersecurity/Software UI: Ground truth was the design inputs/specifications.
    • For Human Factors: Ground truth was identified use-related risks, user tasks, and observed user interaction.

    8. The sample size for the training set

    • Not applicable. This device is a hardware camera with software for compatibility, not an AI model requiring a training set.

    9. How the ground truth for the training set was established

    • Not applicable. This device is a hardware camera with software for compatibility, not an AI model requiring a training set.