Search Results
Found 2 results
510(k) Data Aggregation
(41 days)
The HealthView ECG Manager is a computer system, used within a clinical network, which is intended to be used by trained professionals. It provides the ability to retrieve, store, edit, send and print ECGs and related digitized clinical documents through the use of on-screen measurement and editing tools. The system receives files (such as ECG, stress, or Holter) from any compatible device, displaying such data to the clinician for analysis and review.
The product does not modify the original ECG waveform and does not provide an automated ECG analysis.
HealthView ECG Manager (version 1.0), hereinafter referred to as ECG Manager, is a Programmable Diagnostic Computer system. It is a "software only" medical device, to be installed on a server and workstation(s) that meet the minimum hardware requirements noted in the documentation. The hardware itself is not considered a medical device and is not part of this 510(k) submission. The device provides a trained user with the ability to find, retrieve, view, and integrate ECGs into a single patient record, to assist in the diagnosis and treatment planning of patients. The device does not contact the patient and does not control any life-sustaining devices.
The provided document is a 510(k) summary for the HealthView ECG Manager. As such, it is designed to demonstrate substantial equivalence to predicate devices rather than fully detail a clinical study with acceptance criteria and performance data in the way a Premarket Approval (PMA) submission might.
Based on the provided text, here's a breakdown of the requested information:
1. A table of acceptance criteria and the reported device performance
The document does not explicitly present a table of acceptance criteria with corresponding device performance for a formal clinical study. Instead, it describes performance substantiation through a series of verification and validation activities.
| Acceptance Criteria (Inferred from Performance Test Data) | Reported Device Performance |
|---|---|
| Accuracy of measurement tools: Ability to provide accurate measurements comparable to cleared devices. | Verified accuracy of measurement tools using other cleared devices. |
| Speed of performance in simulated network environment: Efficient operation under specified network conditions. | Verified the speed of performance in a simulated network environment. |
| Retrieval speed at validation site: Timely retrieval of data in a real-world clinical setting. | Validated the retrieval speed at a validation site. |
| Tools accuracy at validation site: Accuracy of tools when used in a real-world clinical setting. | Validated tools accuracy at a validation site. |
| All identified requirements met: All functional and non-functional requirements are satisfied. | Every identified requirement has been tested and confirmed to be performing as expected (see Section 16, and provided screenshots of test files). |
| Risk management effectiveness: Potential hazards identified and mitigated according to ISO 14971:2007. | Risk management ensured by use of the ISO 14971:2007 standard; potential hazards individually confirmed to be controlled via verification and validation testing. |
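The first row of the table — verifying measurement-tool accuracy against a previously cleared device — amounts to a paired-comparison check. A minimal sketch of such a check follows; the function name, the paired readings, and the 4 ms tolerance are all hypothetical assumptions for illustration, not values from the submission:

```python
# Minimal sketch of a measurement-accuracy verification check.
# Hypothetical: compare on-screen caliper measurements (in ms) against
# values reported by a previously cleared reference device.

def within_tolerance(measured_ms: float, reference_ms: float, tol_ms: float = 4.0) -> bool:
    """Pass if the tool's measurement is within tol_ms of the reference value."""
    return abs(measured_ms - reference_ms) <= tol_ms

# Hypothetical paired readings: (tool measurement, reference device value).
paired_readings = [(402.0, 400.0), (158.0, 160.0), (96.0, 95.0)]

# The verification activity passes only if every pair is within tolerance.
results = [within_tolerance(m, r) for m, r in paired_readings]
all_pass = all(results)
```

In a real verification protocol the tolerance and the reference values would be traceable to a documented requirement, not hard-coded as here.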
2. Sample size used for the test set and the data provenance (e.g. country of origin of the data, retrospective or prospective)
The document does not specify a distinct "test set" sample size in the context of a clinical study or a specific number of ECGs or patient records used for performance testing. The performance substantiation relies on internal verification and validation activities, including "simulated network environment" and "validation site" testing, but no details on the number of cases or data provenance are provided.
3. Number of experts used to establish the ground truth for the test set and the qualifications of those experts (e.g. radiologist with 10 years of experience)
No information is provided regarding the number of experts or their qualifications used to establish ground truth for a test set. The document states that "Data created or modified on this device are evaluated by medical professionals," implying human oversight in clinical use, but this is not tied to a specific test set or ground truth establishment.
4. Adjudication method (e.g. 2+1, 3+1, none) for the test set
No adjudication method is described, as the document does not detail the creation of a ground truth for a specific test set using multiple experts.
5. If a multi-reader multi-case (MRMC) comparative effectiveness study was done, and if so, the effect size of human reader improvement with AI vs. without AI assistance
A multi-reader multi-case (MRMC) comparative effectiveness study comparing human readers with and without AI assistance was not conducted or reported. The device is explicitly stated to "not provide an automated ECG analysis" and is a "software only" system for managing ECGs, not an AI-driven diagnostic tool. It is intended for use by "trained and qualified professionals who have ample opportunity for competent human intervention in interpreting the waveforms and information presented to them."
6. If standalone (i.e., algorithm-only, without human-in-the-loop) performance testing was done
This is not applicable. The device "does not provide an automated ECG analysis" and is designed to assist trained professionals in managing and reviewing ECGs. It is not an algorithm performing a diagnostic task independently.
7. The type of ground truth used (expert consensus, pathology, outcomes data, etc.)
The document does not describe the establishment of a specific "ground truth" using expert consensus, pathology, or outcome data for performance evaluation in a clinical study context. The performance verification focuses on system functionalities like measurement tool accuracy and data retrieval speed, presumably against known standards or expected outputs.
8. The sample size for the training set
This information is not applicable as the HealthView ECG Manager is a software system for managing ECG data, not an AI/ML device that requires training data for a diagnostic algorithm.
9. How the ground truth for the training set was established
This information is not applicable for the same reason as point 8.
(135 days)
CardioPACS is a software device intended to be used by medical professionals for storage, review, query/retrieve, analysis, and post-processing of DICOM medical images as may be generated by echocardiography, radiology, and other modalities. The device may be used as a stand-alone product or in a networked system.
CardioPACS is not intended to be used for reading of mammography images.
HealthView CardioPACS (version 6.0), hereinafter referred to as CardioPACS, is a Picture Archive Communications System. It is a "software only" medical device, to be installed on a server and workstation(s) that meet the minimum hardware requirements noted in the documentation. The hardware itself is not considered a medical device and is not part of this 510(k) submission. The device provides a trained user with the ability to find, retrieve, view, edit, and manipulate images on a workstation, to assist in the diagnosis and treatment planning of patients. The device does not contact the patient and does not control any life-sustaining devices.
CardioPACS is a software-only, DICOM-compliant device that can be used on multiple hardware platforms (provided the minimum hardware requirements are met) and allows viewing, editing, measurement, and other digital image processing.
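At the file level, DICOM compliance starts with the Part-10 media format: a 128-byte preamble followed by the 4-byte prefix "DICM" (per DICOM PS3.10). A minimal sketch of that recognition step is shown below purely as an illustration of the standard's file layout; it is not part of, nor derived from, the CardioPACS implementation:

```python
# Minimal sketch: recognize a DICOM Part-10 file by its 128-byte
# preamble followed by the 4-byte magic "DICM" (per DICOM PS3.10).
# Illustration of the standard only; not the device's implementation.

def is_dicom_part10(data: bytes) -> bool:
    """Return True if `data` begins like a DICOM Part-10 file."""
    return len(data) >= 132 and data[128:132] == b"DICM"

# Hypothetical in-memory example: a blank preamble plus the magic bytes.
fake_header = bytes(128) + b"DICM"
```

A real PACS would go on to parse the File Meta Information group that follows the prefix; this sketch stops at recognition.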
The provided text is a 510(k) summary for the CardioPACS (version 6.0) device. This type of submission focuses on demonstrating substantial equivalence to a predicate device rather than providing detailed clinical study results to meet specific performance acceptance criteria.
Therefore, the document does not contain information about:
- A table of acceptance criteria and reported device performance (in the context of clinical metrics like sensitivity/specificity).
- Sample size used for a test set or its data provenance.
- Number of experts used to establish ground truth or their qualifications.
- Adjudication method for a test set.
- MRMC comparative effectiveness study or human-AI improvement effect size.
- Standalone algorithm performance (as it is a PACS system designed for human use).
- Type of ground truth used for performance evaluation (e.g., pathology, outcomes data).
- Sample size for a training set.
- How ground truth for a training set was established.
Information Provided Regarding Performance and Testing:
The document states the following under "Performance Test Data":
- Acceptance Criteria Mentioned (Implicit/General): "Every identified requirement has been tested and confirmed to be performing as expected."
- Study/Testing Methods: Performance has been substantiated in multiple ways:
- Verifying accuracy of measurement tools using other cleared devices.
- Verifying the speed of performance in a simulated network environment.
- Validating retrieval speed at a validation site.
- Validating tools accuracy at a validation site.
- Compliance: The device has been tested to confirm compliance with voluntary standard DICOM version 3.0.
- Risk Management: The ISO 14971:2007 standard was used to identify and mitigate potential hazards. Verification and validation testing confirmed control of these hazards.
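The retrieval-speed validation listed above is, in essence, a timed pass/fail check. The sketch below illustrates the shape of such a test; `fetch_study` is a hypothetical stand-in for the archive query, and the 2.0-second limit is an assumed acceptance threshold, not a figure from the 510(k):

```python
import time

# Minimal sketch of a retrieval-speed validation check.
# `fetch_study` and the 2.0 s limit are hypothetical assumptions.

def fetch_study(study_id: str) -> bytes:
    """Placeholder retrieval; a real test would query the image archive."""
    return b"\x00" * 1024  # simulated payload

def retrieval_within_limit(study_id: str, limit_s: float = 2.0) -> bool:
    """Pass if the retrieval completes within the acceptance limit."""
    start = time.perf_counter()
    fetch_study(study_id)
    return (time.perf_counter() - start) <= limit_s
```

At a validation site, the same check would be run against the live archive over the clinical network, with the limit traceable to a documented requirement.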
In summary, the document describes general performance testing related to software functionality, DICOM compliance, and accuracy of measurement tools against other cleared devices, rather than a clinical study with specific diagnostic performance metrics (e.g., sensitivity, specificity) against a well-defined ground truth in a clinical population. The purpose of this 510(k) is to demonstrate substantial equivalence to predicate PACS systems, which are image management and viewing systems, not AI-driven diagnostic algorithms requiring extensive clinical performance studies.