Author name: Robert Packard

FDA eSTAR v5.0 – What’s new?

This blog provides a deep dive into the newest version of the FDA eSTAR, version 5.0, released on December 6, 2023.

Why did the FDA release the new eSTAR version as v5.0 instead of v4.4?

A major version update consists of policy changes, regulatory changes, or major changes to the template and will be denoted by a major version number increment (e.g. 4.3 to 5.0). A minor version update will consist of other changes and will be denoted by a minor version number increment (e.g. 4.3 to 4.4). If there are policy or regulatory changes, a new major version of the eSTAR is made before the implementation date, and the previous version of the eSTAR is removed. In this case, enabling PMA content, updates to the international pilot of the eSTAR with Health Canada, and implementation of cybersecurity documentation requirements are considered major changes that trigger the need for a major version update (i.e., 5.0) instead of a minor version update (i.e., 4.4). These changes apply to the IVD eSTAR and the non-IVD eSTAR. If you are generally unfamiliar with the FDA eSTAR, please visit our 510k course page.

What is the deadline for using v5.0?

Version 4.3 of the FDA eSTAR, both the nIVD and IVD versions, will be removed from the FDA website on February 4, 2024. Any submissions that are submitted with an expired version of the eSTAR will be rejected. If you have already uploaded information to an older version of the template, you will need to scroll to the bottom of the eSTAR and export the data to an HTML file. Then you can import the HTML file into the newer version of the eSTAR. Any attachments you made to the older version of the template will not be exported, and you will have to re-attach them to the new template.

[Screenshot: Import/Export function in the FDA eSTAR]

PMA content is enabled in the new FDA eSTAR

Previous versions of the FDA eSTAR included content for premarket approval (PMA) submissions, but the FDA did not enable this functionality until version 5.0. 510k submissions have three types: 1) Traditional, 2) Abbreviated, and 3) Special. PMA submissions also have different types. There are two types of PMA submissions for a new device: traditional and modular. Unfortunately, the FDA eSTAR is not intended for PMAs using the modular approach. For Class 3 devices, the FDA has more stringent controls over changes than for Class 1 and 2 devices. Therefore, a PMA supplement is required for the following types of changes to PMA-approved devices:

  • new indications for use;
  • labeling changes;
  • facility changes for manufacturing or packaging;
  • changes in manufacturing methods;
  • changes in quality control procedures;
  • changes in sterilization procedures;
  • changes in packaging;
  • changes in the performance or design specifications, and
  • extension of the expiration date.

There are several types of PMA supplements, but only three types of supplements can use the FDA eSTAR: 1) Panel-Track, 2) 180-Day, and 3) Real Time. To help you determine which type of PMA supplement to use, the FDA published guidance for modifications to devices subject to the premarket approval process.

PMA Content

The following sections in the FDA eSTAR are specific to PMA submission content requirements:

  • Quality Management System Information
  • Facility Information
  • Post-Market Study (PMS) Plans
  • Attach an exclusion statement, or an Environmental Assessment Report in accordance with 21 CFR 814.20(b)(11)

Health Canada is conducting a pilot with the FDA eSTAR

Health Canada’s FDA eSTAR pilot is now full with a total of 10 participants (originally only 9 were planned). The pilot will test the use of eSTAR for applications submitted to Health Canada. The results of the pilot should be complete soon, and then we expect an extension of the pilot to a broader number of applicants. We heard rumors that the HC eSTAR was overly complicated. Hopefully, v5.0 is simplified.

Were there any changes to the EMC testing section?

EMC Labeling questions were consolidated into a single question instead of four because only one citation is usually provided in this section. A copy of the older version is provided below.

[Screenshot: Old EMC labeling section]

The updated version 5.0 is shown below and has only one question, but the help text was changed.

[Screenshot: Help text box for the updated EMC labeling question]

Does the FDA eSTAR now require more cybersecurity documentation?

Bhoomika Joyappa updated our cybersecurity work instruction (WI-007) to address the updated FDA guidance for cybersecurity documentation. The revisions were completed earlier this month, and you can purchase the updated templates on our website. We have also been telling our subscribers to anticipate a significant revision to the FDA eSTAR template when this happens. The updated eSTAR was released a little over two months after the final cybersecurity guidance, and the change resulted in a three-page section dedicated to cybersecurity documentation. The previous versions of the template included a requirement for documentation of cybersecurity risk management and a cybersecurity management plan/plan for continuing support. The following information must be provided in this section if cybersecurity applies to your device:

  1. risk management – report (attach)
  2. risk management – threat model (attach)
  3. list of threat methodology (text box)
  4. verification that the threat model documentation includes (yes/no dropdown):
    1. global system view
    2. Multi-patient harm view
    3. Updateability/patchability view
    4. Security use case views
  5. cybersecurity risk assessment (attach)
  6. page numbers where methodology and acceptance criteria are documented (text box)
  7. verification that the risk assessment avoids using probability for the likelihood assessment and uses exploitability instead (yes/no dropdown)
  8. software bill of materials or SBOM (attach); see the sketch after this list
  9. software level of support and end-of-support date for each software component (attach)
  10. operating system and version used (text box)
  11. safety and security assessment of vulnerabilities (attach)
  12. assessment of any unresolved anomalies (attach)
  13. data from monitoring cybersecurity metrics (attach)
  14. information about security controls (attach)
  15. page numbers where each security control is addressed (text box):
    1. Authentication controls
    2. Authorization controls
    3. Cryptography controls
    4. Code, data, and execution integrity controls
    5. Confidentiality controls
    6. Event detection and logging controls
    7. Resiliency and recovery controls
    8. Firmware and software update controls
  16. architecture views (attach)
  17. cybersecurity testing (attach)
  18. page numbers where cybersecurity labeling is provided (text box)
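
The FDA asks for the SBOM in item 8 to be machine-readable, and item 9 asks for the level of support and end-of-support date for each software component. As a rough illustration only, the Python sketch below assembles a single hypothetical component record with NTIA-style minimum elements plus the support fields and writes it to a JSON file. This is not the FDA's required format; the field names, component, and dates are assumptions.

```python
# Minimal sketch of a machine-readable SBOM record (items 8 and 9 in the list above).
# The field names loosely follow the NTIA "minimum elements"; the schema, component,
# and dates are hypothetical examples, not an FDA-mandated format.
import json
from datetime import date

components = [
    {
        "supplier": "Example Vendor Inc.",             # who supplies the component
        "name": "example-crypto-library",              # component name (hypothetical)
        "version": "3.0.12",                           # component version
        "dependency_relationship": "direct",           # direct vs. transitive dependency
        "author_of_sbom_data": "Device Manufacturer",  # who authored this SBOM entry
        "timestamp": date.today().isoformat(),         # when the record was generated
        # Item 9 in the eSTAR list: support status for the component
        "level_of_support": "actively maintained",
        "end_of_support_date": "2026-12-31",
    },
]

# Write the SBOM to a JSON file that could be attached to the submission.
with open("sbom_example.json", "w") as f:
    json.dump({"device": "Hypothetical Device", "components": components}, f, indent=2)

print(f"Wrote SBOM with {len(components)} component(s) to sbom_example.json")
```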

Sterility section changes include an updated question on EO residuals

In the sterility section of the FDA eSTAR, there was a question about sterilant residues. Specifically, the question was “What are the maximum levels of sterilant residual that remain on the device?” The space provided for entering the information was small as well.

[Screenshot: EO residue help text]

Now the question is reworded to: “What are the maximum levels of sterilant residuals that remain on the device, and what is your explanation for why those levels are acceptable for the device type and the expected duration of patient contact?” No change was made to the help text for this question.

In addition to the changes in the sterility section regarding EO residuals, the FDA also modified the dropdown menu and the help text for pyrogenicity testing. There were options for “LAL” and “Rabbit Test” separately, but now these are combined into “LAL and Rabbit Pyrogen Test.” In addition, the following help text was added: “If you previously conducted rabbit testing on these materials, please either: 1) reference this testing according to the submission number in your attached Pyrogenicity documentation and specifically cite the attachment(s) and page number(s) where the testing is found in that submission, or 2) attach your previous test report.”

[Screenshot: Pyrogenicity help text]

Why do you get an error message when opening the FDA eSTAR?

Many clients say that they get an error message when they try to open the FDA eSTAR template. This is because they are opening the eSTAR from a PDF viewer instead of Adobe Acrobat Pro.

[Screenshot: “Please wait” error message]

Some people want to save money by using the free Adobe Acrobat Reader software instead, but this will not allow you to complete the eSTAR properly. Therefore, the FDA added a popup message that appears if Adobe Acrobat Reader is used.

How are devices with a breathing gas pathway evaluated for biocompatibility?

In the screen capture below, I have intentionally selected “Surface Device: Mucosal Membrane” as the type of tissue contact for a breathing gas pathway device because the device will have a mouthpiece placed in your mouth (i.e., mucosal membrane). This is a common mistake. In version 5.0 of the FDA eSTAR, the FDA clarifies that these devices should be evaluated as “externally communicating” and the tissue contact is “tissue/bone/dentin.” Specifically, the tissue contact is the lungs. For this reason, the FDA added the help text shown below in the JavaScript Window regarding the applicability of ISO 18562-1, -2, -3, and -4.

[Screenshot: ISO 18562 references for biocompatibility]

Additional questions and guidance will appear when you click on the individual blue boxes shown above. For the blue box labeled “Subacute/Subchronic,” you will find additional help text regarding the ISO 18562 standards. Similar help text is found when you click the blue box labeled “Acute Systemic & Pyrogenicity.”

[Screenshot: Additional ISO 18562 help text]

What is a cross-section change reminder?

One of the minor changes made in this FDA eSTAR version is the addition of “cross-section change reminders” to the help text in the device description section. These reminders are not meant to help you avoid answering questions in your submission. If a section of the submission is missing because you answered “No” instead of “Yes,” the FDA reviewer will identify the error during the Technical Review process. This will result in your submission being placed on hold, and the review time clock will be reset to zero days when you resubmit with the corrections made. The screen capture below shows an example of one of these cross-section change reminders.

[Screenshot: Cross-section change reminder]

What changes were made to the clinical testing section of the FDA eSTAR?

The clinical testing section will now display when using PDF-XChange Editor, but we recommend only using Adobe Acrobat Pro to edit the FDA eSTAR. This change is a bug fix, and it is specific to the nIVD eSTAR. The IVD eSTAR and the nIVD eSTAR both include a clinical testing section within the performance testing section, but one template places the performance testing section before the electrical safety and EMC testing section, while the other places it after. If your company is planning to submit clinical data in a future FDA submission, we have the following recommendations:

  • watch the CDRH Learn webinars on the topic of 21 CFR 812
  • conduct a pre-submission teleconference to ask questions about your clinical study protocol before IRB submission or ethics review board submission
  • before you submit the pre-sub meeting request, look at what general clinical information the FDA wants for a De Novo or PMA submission in the FDA eSTAR

[Screenshot: FDA eSTAR clinical section]

Note: The clinical section shown above is only found in the FDA eSTAR if you select a De Novo or PMA submission. If you submit a 510k submission with clinical data, the clinical section will be abbreviated as shown below.

[Screenshot: FDA eSTAR clinical section for a 510k]

Complaints handling mistakes – Why?

Complaints handling mistakes, medical device reporting, and CAPA are the most common reasons for FDA 483 inspection observations, but why?

Reasons for FDA 483s related to the CAPA process

You should already be well aware that deficiencies in the CAPA process, complaints handling, and medical device reporting are the three most common reasons the FDA has issued 483 inspection observations and Warning Letters in 2023. The most common reason for an FDA 483 inspection observation is related to the CAPA process (i.e., 336 observations citing 21 CFR 820.100). For the CAPA process, all 336 observations cited problems with inadequate procedures or inadequate records.

Reasons for complaints handling mistakes

The complaints handling process is the second most common reason for FDA 483 inspection observations (i.e., 276 observations citing 21 CFR 820.198). The FDA cites nine different reasons for 483 inspection observations related to complaints handling (listed from most common to least common):

  1. Procedures for receiving, reviewing, and evaluating complaints by a formally designated unit have not been [adequately] established.  Specifically,*** 
  2. Complaints involving the possible failure of [a device] [labeling] [packaging] to meet any of its specifications were not [reviewed] [evaluated] [investigated] where necessary. Specifically, *** 
  3. Records of complaint investigations do not include required information.  Specifically, *** 
  4. Complaint files are not [adequately] maintained.  Specifically, *** 
  5. Not all complaints have been [adequately] reviewed and evaluated to determine whether an investigation is necessary. Specifically, ***
  6. Records for complaints where no investigation was made do not include required information.  Specifically, *** 
  7. Complaints representing events that are MDR reportable were not [promptly reviewed, evaluated, and investigated by a designated individual] [maintained in a separate portion of the complaint files] [clearly identified]. Specifically, ***
  8. Investigation records of MDR reportable complaints do not include required information.  Specifically, *** 
  9. Records of complaint investigations do not include required information, including any unique device identifier (UDI) or universal product code (UPC).  Specifically, ***

Reasons for FDA 483s related to Medical Device Reporting

There have been 106 observations related to medical device reporting (i.e., 21 CFR 803) in 2023 thus far. There are 25 different reasons identified by the FDA for 483 inspection observations related to the Medical Device Reporting regulation. The majority of the inspection observations were related to an inadequate or missing MDR procedure. However, there were also a number of inspection observations related to missing information in the MDR records. Therefore, we updated our Medical Device Reporting Procedure to include all of the required elements of the FDA’s MedWatch Form. We posted a blog about “Where to Focus your Medical Device Complaint Handling Training.” In that blog, we answered questions from device manufacturers and consultants regarding the process of complaints handling investigation. The following section is a summary of my responses to those questions.

Complaints handling investigations

What criteria do you think should be used to determine whether a complaint should be investigated or not?

There is only one acceptable rationale for not investigating a complaint. If you don’t investigate complaints when required, then you might receive an FDA Form 483 observation worded like this…

21 CFR 820.198(c) – Complaints involving the possible failure of labeling to meet any of its specifications were not investigated where necessary. Specifically, a missing IFU was reported in customer complaints, but no investigation was conducted. The rationale documented in the complaint record was “the missing IFU presented no patient risk.”

A missing IFU is a “failure of labeling to meet any of its specifications.” Therefore, 21 CFR 820.198(c) requires you to conduct an investigation “unless such investigation has already been performed for a similar complaint, and another investigation is not necessary.” This is the only rationale that is acceptable for skipping your investigation. To ensure that no one forgets to investigate a complaint, make sure you include a space in your complaint handling form that is specifically labeled as “Summary of Complaint Investigation.” This space should also include an option to cross-reference to a previous complaint record where a similar investigation is already documented.

A missing IFU is also considered a misbranded product that requires correction (e.g., sending the customer a replacement IFU) or removal (i.e., recall). The FDA expects a Health and Hazard Evaluation (HHE) form to be included in your recall records, and the HHE should indicate the potential risk of a “delay in treatment.” This is the FDA’s conclusion in their evaluation of risk, and therefore your HHE must identify a delay in treatment as a patient risk too. The FDA also expects a CAPA to be initiated to prevent the recurrence of this type of labeling error. You can make a “risk-based” determination that reporting a specific recall to the FDA is not required as per 21 CFR 806.20. However, you need to maintain records of your determination not to report a recall. If you already received a Warning Letter, you should err on the side of reporting anyway.

Note: References to “recall” in the above paragraph are meant to include field corrections.

Intended Use

If a complaint consists of a medical device being used for something other than its intended use, is an MDR required for this user error?

The answer is yes. If you don’t report adverse events involving “user error,” then you might receive an FDA Form 483 observation worded like this…

21 CFR 803.17(a)(1) – The written MDR procedure does not include an internal system which provides for the timely and effective evaluation of events that may be subject to medical device reporting requirements.  Specifically, several incidents where a death or serious injury occurred were “caused by a user error,” and the procedure did not identify this as an event requiring Medical Device Reporting.

In 21 CFR 803.3, the FDA defines “caused or contributed” to include events occurring as a result of:

  1. Failure
  2. Malfunction
  3. Improper or inadequate design
  4. Manufacture
  5. Labeling, or
  6. User error

It is important to understand that the definition of complaints and the requirement to report adverse events should not be “risk-based.” The need for remediation and the need to report corrections and removals can be “risk-based,” but whether something is a complaint and whether it is reportable should be “black-and-white.” For example, “Did the death or serious injury occur due to a ‘user error,’ including use other than the intended use?” If the answer is yes, then it is a complaint and reportable.

Incidents and Adverse Event Reporting

Do incidents that occurred outside the United States need to be reported to FDA?

The answer is yes. If you don’t report adverse events that occur outside the United States, then you might receive an FDA Form 483 observation worded like this…

21 CFR 803.50(a)(1) – An MDR report was not submitted within 30 days of receiving or otherwise becoming aware of information that reasonably suggests that a marketed device may have caused, or contributed to, a death or serious injury. Specifically, several instances were identified where the device caused or contributed to a death or serious injury, and the event was not reported to the Agency. The rationale documented in the complaint record was that the “event occurred outside the United States.”

This type of mistake is most likely due to a lack of training on 21 CFR 803–Medical Device Reporting. Some manufacturers that distribute products internationally are more familiar with the European Vigilance requirements as defined in Articles 87-89 of Regulation (EU) 2017/745. You can find additional guidance on vigilance reporting in our Vigilance Procedure or the applicable MDCG guidance. The European Medical Device Directive (i.e., MDD) only required vigilance reporting of incidents that occurred outside the Member States of the European Economic Area (EEA), Switzerland, and Turkey if the incident required implementation of field safety corrective actions. The EU MDR now requires reporting of incidents that occur outside of the EU if the device is also made available in the EU.

The FDA Part 803 requirements are worded differently. Part 803 does not indicate that the event had to occur in the United States. The MedWatch form (i.e., FDA Form 3500A) must be filed for events that occur in the United States and events occurring outside the USA if the devices are “similar” to devices marketed in the USA. Unfortunately, most device manufacturers are not aware of this requirement. Therefore, the FDA released a final guidance on Medical Device Reporting requirements on November 8, 2016. If you would like to learn more about Medical Device Reporting requirements, you can purchase our MDR procedure and webinar bundle. We will also be expanding our consulting services in January 2024 to include Medical Device Reporting for our clients.

Additional Resources on Complaints Handling

Medical Device Academy sells a complaints handling procedure and a webinar on complaints handling. We will be updating the procedure during the holidays and hosting a new live complaints handling webinar on January 4, 2024. If you purchase the webinar, or you purchased the webinar in the past, you will receive an invitation to participate in the live webinar in January. If you purchase the complaints handling procedure, or you purchased the procedure in the past, you will receive the updated procedure, updated complaints handling form, and updated complaints log. You will also receive an invitation to the live webinar because we will be bundling the webinar with the updated procedure. We will also provide a discount code during the live webinar for people to upgrade their purchase of the webinar to include the purchase of the procedure. Customers who purchased one of our turnkey quality systems will also receive access to the live webinar.

eSTAR Project Management

Using the new FDA eSTAR template also requires a new process for eSTAR project management to prepare your 510k and De Novo submissions.

Outline of ten (10) major changes resulting from the new FDA eSTAR template

As of October 1, 2023, all 510k and De Novo submissions to the FDA must be prepared using the new FDA eSTAR template, and the template must be uploaded to the FDA Customer Collaboration Portal (CCP). Yesterday, the FDA published an updated guidance explaining the 510k electronic submission requirements, but there are ten (10) major changes to Medical Device Academy’s submission process resulting from the new eSTAR templates:

  1. We no longer need a table of contents.
  2. We no longer use the volume and document structure.
  3. We are no longer required to conform to sectioning or pagination of the entire submission.
  4. We no longer worry about the RTA screening or checklist (it doesn’t exist).
  5. We no longer bother creating an executive summary (it’s optional).
  6. We no longer have a section for Class 3 devices, because there are no Class 3 510(k) devices anymore.
  7. We no longer use FDA Form 3514, because that content is now incorporated into the eSTAR.
  8. We no longer create a Declaration of Conformity, because the eSTAR creates one automatically.
  9. We no longer recommend creating a 510(k) Summary, because the eSTAR creates one automatically.
  10. We no longer use FedEx, because we can upload to FDA CCP electronically instead.

What is different in the 510k requirements?

Despite all the perceived changes to the FDA’s pre-market notification process (i.e., the 510k process), the format and content requirements have not changed much. The most significant recent change to the 510k process was the requirement to include cybersecurity testing.

Outline of eSTAR Project Management

There were 20 sections in a 510k submission, and Medical Device Academy’s consulting team created a template for the documents to be included in each section. eSTAR project management is different because there are no section numbers to reference. To keep things clear, we recommend using one or two words at the beginning of each file name to define the section it belongs in. The words should match up with the bookmarks used by the FDA. However, you should be careful not to make the file names too long. The sections are described below.

The Benefit, Risks, and Mitigation Measures Section only applies to De Novo Classification Requests. The Quality Management Section includes subsections for Quality Management System Information, Facility Information, Post-Market Studies, and References. However, only the References subsection will be visible in most submissions because the other three subsections are part of the Health Canada eSTAR pilot. Other sections and subsections will be abbreviated or hidden depending on the dropdown menu selections you make in the eSTAR. For example, the cybersecurity section will remain hidden if your device does not have wireless functionality or a removable storage drive.

[Screenshot: Wireless functionality marked not applicable]

A Table of Contents is no longer required for 510k submissions

510k submissions using the FDA eCopy format required a Table of Contents, and Medical Device Academy used the Table of Contents as a project management tool. Sometimes, we still use our Table of Contents template to communicate assignments and manage the 510k project. The sections of the Table of Contents would also be color-coded green, blue, yellow, and red to communicate the status of each section. FDA eSTAR project management uses a similar color coding process with colored bars on the side of the template to indicate if the section is incomplete, complete, or optional.

[Screenshot: Color coding of the eSTAR]

The eSTAR also has a verification section at the end of the template to help with eSTAR project management. The verification section lists each of the 13 major sections of an FDA eSTAR. When a section is completed, the section’s name automatically moves from the right side of the verification section to the left side. During the past two years (2021 – 2023) of implementing the eSTAR template, I have slowly learned to rely only on the eSTAR to communicate the status of each section. To assign responsibilities for each section of the 510k submission, we still use the Table of Contents, simple lists, and project management tools like Asana. Using the eSTAR verification section to check on the status of each 510k section also increases our team’s proficiency with the eSTAR every time we use it.

[Screenshot: eSTAR verification section]

Using Dropbox for eSTAR project management

PreSTAR templates for a Q-Sub meeting are approximately half the length (i.e., 15 pages instead of 30+ pages) of an eSTAR template, and the 510k submission requires far more attachments than a Q-Sub. Therefore, we can usually email a revised draft of the PreSTAR to a team member for review, but we can’t use email to share a nearly complete eSTAR with a team member. Instead, Medical Device Academy uses Dropbox to share revisions of the eSTAR between team members. Some of our clients use OneDrive or Google Drive to share revisions. We also create sub-folders for each type of testing. This keeps all of the documents and test reports for a section of the eSTAR in one place. For example, the software validation documentation will be organized in one sub-folder of the Dropbox folder for a 510k project.

When using FDA eCopies instead of the FDA eSTAR template, we used twenty subfolders labeled and organized by volume numbers 1-20. Some of those 20 sections are now obsolete (e.g., Class III Summary), and others (e.g., Indications for Use) are integrated directly into the eSTAR template. Therefore, a team may only need 8-10 sub-folders to organize the documents and test reports for a 510k project. We typically do not attach these documents and test reports until the very end of the submission preparation because if the FDA releases a new version of the eSTAR, the attachments will not export from an older version of the eSTAR to the new version.
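
If you prefer to script the folder setup, here is a minimal sketch that creates a set of sub-folders named after eSTAR sections for a 510k project. The project path and folder names are assumptions based on the sections discussed in this article, so rename or trim them to match the bookmarks in your version of the eSTAR and your own file-sharing platform.

```python
# Minimal sketch: create sub-folders named after eSTAR sections for a 510k project.
# The project path and folder names are assumptions; adjust them to match the
# bookmarks in your version of the eSTAR and your own file-sharing platform.
from pathlib import Path

project_root = Path("Dropbox/510k Project - Hypothetical Device")

estar_sections = [
    "Device Description",
    "Labeling",
    "Sterility",
    "Shelf Life",
    "Biocompatibility",
    "Software",
    "Cybersecurity",
    "EMC and Safety",
    "Performance Testing",
]

for section in estar_sections:
    # parents=True builds the project root if it does not exist yet;
    # exist_ok=True makes the script safe to re-run.
    (project_root / section).mkdir(parents=True, exist_ok=True)

print(f"Created {len(estar_sections)} sub-folders under {project_root}")
```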

Coordination of team collaboration is critical to successful eSTAR project management

In the past, Medical Device Academy always used a volume and document structure to organize an FDA eCopy because this facilitated multiple team members simultaneously working on the same 510k submission, even from different countries. Many clients will use SharePoint or Google Docs to facilitate simultaneous collaboration by multiple users. Unfortunately, the eSTAR cannot be edited by two users simultaneously because it is a secure template that can only be edited in Adobe Acrobat Pro. Therefore, the team must communicate when the eSTAR template is being updated and track revisions. For communication, we use a combination of instant messenger apps (e.g., Slack or WhatsApp) and email, while revisions are tracked by adding the initials and date of the editor to the file name (e.g., nIVD 4.3 rvp 12-5-2023.pdf).
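
As a small illustration of the revision-tracking convention described above, the sketch below builds a file name from the template name, the editor's initials, and the date. The function name and date format are illustrative assumptions, not a required convention.

```python
# Minimal sketch of the revision-tracking file-naming convention described above:
# template name + editor initials + date, e.g., "nIVD 4.3 rvp 12-5-2023.pdf".
# The function name and date format are illustrative, not a required convention.
from datetime import date
from typing import Optional


def revision_filename(template: str, initials: str, on: Optional[date] = None) -> str:
    """Return a revision-tracked file name such as 'nIVD 5.0 rvp 12-6-2023.pdf'."""
    on = on or date.today()
    # Month and day are not zero-padded, to match the example in the article.
    stamp = f"{on.month}-{on.day}-{on.year}"
    return f"{template} {initials} {stamp}.pdf"


print(revision_filename("nIVD 5.0", "rvp", date(2023, 12, 6)))
# -> nIVD 5.0 rvp 12-6-2023.pdf
```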

Importance of peer reviews

Each section of the FDA eSTAR must be completed before the submission can be uploaded to the Customer Collaboration Portal (CCP). If the FDA eSTAR is incomplete, the CCP will identify the file as incomplete, and you will not be able to upload the file. If questions in the eSTAR are answered incorrectly, sections that should be completed may never be activated. Below are two examples of how the eSTAR questions can be incorrectly answered.

  • Example 1 – One of the helpful features of the FDA eSTAR is that many fields are populated with a dropdown menu of answers. One example is found in the Classification section of the eSTAR. This section requires the submitter to identify the device’s classification by answering three questions: 1) review panel, 2) classification regulation, and 3) the three-letter product code. Each of these fields uses a dropdown menu to populate the field, and the dropdown options for questions two and three depend on the answer to the previous question. However, if you manually type the product code into the field for the third question, then the eSTAR will not identify any applicable special controls guidance documents for your device. Unless you are already aware of an applicable special controls guidance document, you will answer questions in the eSTAR about special controls with “N/A.” The eSTAR will only identify a special controls guidance document for your device if you select a product code from the dropdown menu, but the FDA reviewer knows which special controls guidance documents are applicable. This is why the FDA performs a technical screening of the eSTAR before the substantive review begins.
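
To visualize the cascading dependency described in Example 1, here is a minimal sketch of how each dropdown's options can depend on the previous selection. The review panels, regulations, and product codes are hypothetical placeholders, not actual FDA classification data, and the sketch only models the behavior described above rather than the eSTAR's internal logic.

```python
# Sketch of the cascading dropdown behavior described in Example 1.
# The review panels, regulations, and product codes below are hypothetical placeholders.
CLASSIFICATION_OPTIONS = {
    "Hypothetical Review Panel A": {
        "21 CFR 8xx.xxxx (hypothetical regulation 1)": ["AAA", "AAB"],
        "21 CFR 8xx.yyyy (hypothetical regulation 2)": ["ABC"],
    },
    "Hypothetical Review Panel B": {
        "21 CFR 8yy.zzzz (hypothetical regulation 3)": ["XYZ"],
    },
}


def product_code_choices(panel: str, regulation: str) -> list:
    """Return the product codes available once the panel and regulation are selected."""
    return CLASSIFICATION_OPTIONS.get(panel, {}).get(regulation, [])


# Selecting from the dropdowns keeps the three answers consistent; typing a product
# code manually bypasses this lookup, which is why the eSTAR cannot then identify
# the applicable special controls guidance documents.
print(product_code_choices("Hypothetical Review Panel A", "21 CFR 8xx.xxxx (hypothetical regulation 1)"))
```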

[Screenshot: Classification section]

  • Example 2 – If you indicate the cumulative duration of contact for an externally communicating device is < 24 hours, the eSTAR template will expect you to evaluate the following biocompatibility endpoints: cytotoxicity, sensitization, irritation, systemic toxicity, and pyrogenicity.

[Screenshot: Biocompatibility endpoints for a duration of contact < 24 hours]

However, if you indicate the cumulative duration of contact is between 24 hours and 30 days, the eSTAR template will be populated with additional biocompatibility endpoints. The eSTAR doesn’t know what the cumulative duration of use is, but the FDA reviewer will. This is why the FDA performs a technical screening of the eSTAR before the substantive review begins.

[Screenshot: Biocompatibility endpoints for a duration of contact up to 30 days]

To ensure that all of the sections of your submission are complete, it’s helpful to have a second person review all of the answers and verify that everything was completed correctly. Even experienced consultants who prepare 510k submissions every week can make a mistake and incorrectly answer a question in one of the eSTAR fields. Therefore, you shouldn’t skip this critical QC check.

Additional 510k Training

The 510k book, “How to Prepare Your 510k in 100 Days,” was completed in 2017, but the book is only available as part of our 510k course series consisting of 58+ webinars. Please visit the webinar page to purchase individual webinars.

510k Electronic Submission Guidance for FDA 510k Submissions

This is an overview of the updated 510k electronic submission guidance document that the FDA released on October 2, 2023.

What’s included in the 510k electronic submission guidance?

As with any FDA guidance, there is an introduction and background regarding the reason for the updated guidance document (i.e., eSTAR guidance). At the very beginning of the document (i.e., page 3), the reference to the RTA Guidance was deleted, because there is no longer an RTA screening process with the implementation of the FDA eSTAR templates. The updated guidance explains on page 6 that “The CDRH Portal will automatically verify that the eSTAR is complete, and therefore we do not expect to receive incomplete 510(k) eSTARs.”

In the scope section, the FDA specifies that this document is specific to 510k submissions using the eSTAR template. The document also explains that CBER conducted a pilot with the eSTAR template in June 2022, and now the FDA eSTAR template must be used in conjunction with the CDRH Portal for submission of a 510k to CBER. The FDA has plans to release a similar De Novo submission guidance for using the eSTAR template, but this has not happened in the year since the FDA announced the intention to do so.

In the “Significant Terminology” section of the guidance (i.e., Section IV), the FDA provides definitions for each of the different types of submissions: eCopy, eSubmitter, etc. In the “Current Electronic Submission Template Structure, Format, and Use” section of the guidance (i.e., Section V), the FDA modified the term used for the company that is applying for 510k clearance from “Submitter” to “Applicant,” because sometimes a regulatory consultant or 3rd party reviewer is submitting the 510k on behalf of the applicant. On page 12 of the updated guidance, the FDA added “Withdrawal requests” to the list of 510k submissions/information that is exempt from the 510k electronic submission requirements.

In the next-to-last section of the electronic submission guidance, the FDA provides a table outlining all of the sections of the new eSTAR template. The table is reproduced later in this article. If you are interested in a tutorial on completing each section outlined in the table, we recommend purchasing Medical Device Academy’s 510(k) Course. The last section of the eSTAR guidance indicates the timing for compliance with the updated guidance (i.e., October 1, 2023).

[Screenshot: Revisions to the FDA eSubmissions guidance (October 2, 2023)]

What is the deadline for compliance with the guidance?

The deadline has now passed. The new eSTAR template must be used for all 510k and De Novo submissions as of October 1, 2023. You must upload the new FDA eSTAR submissions using the CDRH Portal. You will need to request an account using a registration hyperlink.

What’s missing from this 510k submission guidance?

The updated 510k electronic submission guidance does not provide information regarding the receipt date for electronic submissions made through the new customer collaboration portal (CCP) created by CDRH. The image below is a screen capture of the current CCP upload webpage. It includes the following statement, “Send your submission before 16:00 ET on a business day for us to process it the same day.” This statement was added sometime in August or September, but the FDA has not released a detailed explanation. This statement makes it clear that the FDA is not promising to process a submission the “same day” if the submission is received after 4:00 p.m. ET. However, “processed” does not have the same meaning as “receipt date.”

[Screenshot: CCP upload page]

The FDA must be consistent in the wording for “Hours for Receipt of Submission” because this affects submissions at the end of the fiscal year, and it also affects any submission with a response deadline, such as a response to an RTA Hold, an AI Response, or an IDE submission. The CDER and CBER divisions of the FDA address the need for defining the date of receipt in a guidance document specific to this topic, “Providing Regulatory Submissions in Electronic Format–Receipt Date.” Below is a screen capture copied from page 4 of the guidance.

[Screenshot: Excerpt from page 4 of the receipt date guidance]

Another element missing from this new guidance is a reference to human factors documentation. For any devices that have a user interface that is different from the predicate device, and for software devices, the FDA requires documentation of your human factors process to make sure that differences in the user interface do not result in new or different risks when compared to the predicate device. The 2016 FDA guidance for human factors has not been updated, but FDA reviewers continue to issue deficiencies related to the objective evidence provided in a 510k for human factors validation.

What are the new sections for a 510k submission?

In 2019, the FDA released a guidance document on the “Format of Traditional and Abbreviated 510(k)s.” That guidance outlines the 20 sections of a traditional 510k submission that have been used for decades. However, the new 510k electronic submission guidance has no numbering for the sections of the eSTAR template, and there are 22 sections instead of 20. Several of the new sections are elements of the current FDA submission cover sheet (i.e., FDA Form 3514), and some sections that existed in the 2019 guidance were eliminated, such as the “Class III Summary and Certification.” Therefore, Medical Device Academy is recreating 100% of our 510k training webinars to explain how our 510k templates are used with the 510k eSTAR template and how to fill in the PDF form. To prevent confusion between the two formats, we are using letters for each section in the eSTAR template instead of numbers (i.e., A-V instead of 1-20). Table 1 from the new eSTAR guidance is reproduced below for your information.

Table 1 – Information Requested and Description for each section of the eSTAR:

A. Submission Type: Identification of key information that may be useful to FDA in the initial processing and review of the 510(k) submission, including content from current Form FDA 3514, Section A.
B. Cover Letter / Letters of Reference: Attach a cover letter and any documents that refer to other submissions.
C. Submitter Information: Information on submitter and correspondent, if applicable, consistent with content from current Form FDA 3514, Sections B and C.
D. Pre-Submission Correspondence & Previous Regulator Interaction: Information on prior submissions for the same device included in the current submission, such as submission numbers for a prior not substantially equivalent (NSE) determination, prior deleted or withdrawn 510(k), Q-Submission, Investigational Device Exemption (IDE) application, premarket approval (PMA) application, humanitarian device exemption (HDE) application, or De Novo classification request.
E. Consensus Standards: Identification of voluntary consensus standard(s) used, if applicable. This includes both FDA-recognized and nonrecognized consensus standards.
F. Device Description: Identification of listing number if listed with FDA. Descriptive information for the device, including a description of the technological characteristics of the device including materials, design, energy source, and other device features, as defined in section 513(i)(1)(B) of the FD&C Act and 21 CFR 807.100(b)(2)(ii)(A). Descriptive information also includes a description of the principle of operation for achieving the intended effect and the proposed conditions of use, such as surgical technique for implants; anatomical location of use; user interface; how the device interacts with other devices; and/or how the device interacts with the patient. Information on whether the device is intended to be marketed with accessories. Identification of any applicable device-specific guidance document(s) or special controls for the device type as provided in a special controls document (or alternative measures identified that provide at least an equivalent assurance of safety and effectiveness) or in a device-specific classification regulation, and/or performance standards. See “The 510(k) Program: Evaluating Substantial Equivalence in Premarket Notifications [510(k)].”
G. Proposed Indications for Use (Form FDA 3881): Identification of the proposed indications for use of the device. The term indications for use, as defined in 21 CFR 814.20(b)(3)(i), describes the disease or condition the device will diagnose, treat, prevent, cure, or mitigate, including a description of the patient population for which the device is intended.
H. Classification: Identification of the classification regulation number that seems most appropriate for the subject device, as applicable.
I. Predicates and Substantial Equivalence: Identification of a predicate device (e.g., 510(k) number, De Novo number, reclassified PMA number, classification regulation reference, if exempt and limitations to exemption are exceeded, or statement that the predicate is a preamendments device). The submission should include a comparison of the predicate and subject device and a discussion why any differences between the subject and predicate do not impact safety and effectiveness [see section 513(i)(1)(A) of the FD&C Act and 21 CFR 807.87(f)]. A reference device should also be included in the discussion, if applicable. See “The 510(k) Program: Evaluating Substantial Equivalence in Premarket Notifications [510(k)].”
J. Design/Special Controls, Risks to Health, and Mitigation Measures: Applicable to Special 510(k) submissions only. Identification of the device changes and the risk analysis method(s) used to assess the impact of the change(s) on the device and the results of the analysis. Risk control measures to mitigate identified risks (e.g., labeling, verification). See “The Special 510(k) Program.”
K. Labeling: Submission of proposed labeling in sufficient detail to satisfy the requirements of 21 CFR 807.87(e). Generally, if the device is an in vitro diagnostic device, the labeling must also satisfy the requirements of 21 CFR 809.10. Additionally, the term “labeling” generally includes the device label, instructions for use, and any patient labeling. See “Guidance on Medical Device Patient Labeling.”
L. Reprocessing: Information for assessing the reprocessing validation and labeling, if applicable. See “Reprocessing Medical Devices in Health Care Settings: Validation Methods and Labeling.”
M. Sterility: Information on sterility and validation methods, if applicable. See “Submission and Review of Sterility Information in Premarket Notification (510(k)) Submissions for Devices Labeled as Sterile.”
N. Shelf Life: Summary of methods used to establish that device performance is maintained for the entirety of the proposed shelf-life (e.g., mechanical properties, coating integrity, pH, osmolality), if applicable.
O. Biocompatibility: Information on the biocompatibility assessment of patient contacting materials, if applicable. See “Use of International Standard ISO 10993-1, ‘Biological evaluation of medical devices – Part 1: Evaluation and testing within a risk management process.’”
P. Software/Firmware: Submission of applicable software documentation, if applicable. See “Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices.”
Q. Cybersecurity/Interoperability: Submission of applicable information regarding the assessment of cybersecurity, if applicable. See “Content for Premarket Submissions for Management of Cybersecurity in Medical Devices” and “Design Considerations and Premarket Submission Recommendations for Interoperable Medical Devices.”
R. Electromagnetic Compatibility (EMC), Electrical, Mechanical, Wireless and Thermal Safety: Submission of the EMC, Electrical, Mechanical, Wireless and Thermal Safety testing for your device or summarize why testing is not needed. See “Electromagnetic Compatibility (EMC) of Medical Devices” and “Radio Frequency Wireless Technology in Medical Devices.”
S. Performance Testing: For non-in vitro diagnostic devices: Provide information on the non-clinical and clinical test reports submitted, referenced, or relied on in the 510(k) for a determination of substantial equivalence. See “Recommended Content and Format of Non-Clinical Bench Performance Testing Information in Premarket Submissions.” For in vitro diagnostic devices: Provide analytical performance, comparison studies, reference range/expected values, and clinical study information.
T. References: Inclusion of any literature references, if applicable.
U. Administrative Documentation: Inclusion of additional administrative forms applicable to the submission, including but not limited to a general summary of submission/executive summary (recommended), a Truthful and Accuracy Statement, and a 510(k) Summary or statement.
V. Amendment/Additional Information (AI) response: Inclusion of responses to Additional Information requests.

Important information in the eSTAR guidance

In Table 1 above, there are 14 hyperlinks to various FDA guidance documents. These links are extremely helpful when you have questions about a specific section. Unfortunately, the 510k electronic submission guidance document will quickly become out-of-date as guidance documents are updated and made obsolete. In particular, one of the A-list final guidance documents that was planned for FY 2023 was the FDA cybersecurity guidance. The updated cybersecurity guidance was finally released last week.

Hiring an Auditor

In this article, you will learn how to hire an auditor to conduct medical device internal audits and supplier audits.

[Image: Help wanted]
Stop begging people to help you audit. Learn how to recruit auditors more effectively.

Hiring an Auditor

Hiring an auditor, whether as a consultant or a permanent team member, is a critical decision that can drastically improve your quality management system and foster a culture of quality, or it can add no value and lead to disruption and frustration.  The purpose of this blog is to identify the qualities and training that make the best auditor to help you elevate your internal audit programs.  

Audit Program Structures 

Companies typically take one of the following approaches to address their internal audit requirements: 

  1. Train internal personnel with other primary functions as auditors and have them audit other departments.
  2. Hire an independent 3rd party to conduct the internal audits.
  3. Build an internal audit team that is independent of all other processes.  

Hiring an Auditor from Within 

Option 1 is common across the industry and is a personnel-efficient means of achieving the audit objectives. While this type of approach can sometimes be effective and may satisfy the basic requirement to conduct internal audits, there can be some drawbacks to this structure. When people were not hired specifically to be auditors, and auditing is something they are asked to do in addition to their regular job, there is often little to no motivation to develop auditing skills, and the audits lack depth and thoroughness, which ultimately reduces the value of the audit program. Proper internal recruiting and training of these auditors is crucial to ensuring audits are a useful, value-added exercise and not a box-checking chore.

To successfully recruit internal auditors serving in other roles, it’s important to motivate people to want to be an auditor. Let potential recruits know that employees with audit experience are more valuable to companies than those without. Auditing exposes employees to upstream and downstream processes so they better understand the overall operations, and it provides them the opportunity to make process improvements in both directions of their workflow. If you want to be effective and get promoted, you need to demonstrate value to your boss and top management. If you don’t understand what other departments need, how can you help them? No manager will promote a selfish, power-hungry hog. They promote team players that make others better. Auditing gives you the insight necessary to understand how you can do that.

Once motivated and recruited, it’s important to ensure these employees have the skills and resources to be successful as auditors. To help develop their skills, training on audit processes and the responsibilities and role of an auditor in accordance with ISO 19011 will provide guidance on conducting audits and the basics of how to audit. Auditors should also be trained on the specific standard or regulation they are auditing against, which may include ISO 13485, 21 CFR 820, ISO 14971, EU MDR, and others. Resources that will support their activities may include process audit diagrams, checklists, examples of record requests, strategies for intelligent sample selection, and, of course, a clear definition of the regulatory and procedural requirements of the process that they are auditing.

If you are looking for support in training your own employees to be internal auditors, we would be happy to outline or provide a training program specific to your company’s processes and products to ensure your auditors are competent and effective in their new role.  

Hiring a 3rd Party Auditor  

Option 2 can be useful to any company, but selecting the right auditor is essential to the success of this approach. The basic qualifications and qualities that I recommend companies look for when hiring an outside auditor are: 

  1. Experience – This includes industry experience and regulatory knowledge. An auditor with experience auditing or working for a company with similar devices, manufacturing processes, etc., will provide more value than an unfamiliar auditor. Regulatory knowledge and experience within your targeted markets are also important to evaluate to ensure that they are familiar with the standards and regulations against which they will be auditing.
  2. Communication Skills – This is a make-or-break quality of auditors that can shift the substance of an audit from a value-added exercise to a disrupting and frustrating experience. You want to ensure that auditors are affable yet confident, able to communicate the usefulness of the audit for the purpose of process improvement and facilitate a productive dialogue, offering education and suggestions when issues or nonconformances arise.
  3. Reputation and References – Ask the auditor for references from previous clients. Contact the references to get feedback on their performance, reliability, and professionalism. This is a great way to evaluate an auditor’s communication skills and whether previous auditees gained value from the interaction.
  4. Auditor Training – Acceptable qualifications for an auditor can be defined by the company but may include lead auditor certification, demonstrated training on relevant standards with experience shadowing experienced auditors, and documented training on other relevant standards/regulations.
  5. Audit Methodology – Inquire about how auditors plan, execute, and report on audits. What audit methodology does the auditor prefer for the scope of your audit, and why?

There are many companies and consultants that offer 3rd party auditor services, but not all are created equal. Like the CAPA process, the internal audit program is a window into your company’s quality culture. Demonstrating that you proactively police yourself and pursue continuous improvement through an effective internal audit program will show regulators that your company has a commitment to quality.

Hiring a Full-time Audit Team 

Option 3 is generally reserved for resource-rich companies with operations that demand expansive, continuous audit processes to justify the support of a full-time auditor or audit team. Hiring your own team benefits from the same considerations that come with hiring a 3rd party auditor, and the ability for the auditors to become intimately familiar with the company, devices, and processes is valuable. For companies that do not have the need for full-time auditors, the same value of familiarity can come from building a trusted relationship with a third-party auditor or audit team, who can support your audit program year after year.

Hiring an Auditor from Medical Device Academy  

Our goal at Medical Device Academy is to help you improve your quality system and provide valuable consulting advice to achieve improvements. We specialize in helping start-up companies achieve initial ISO 13485 certification, MDSAP certification, and CE Certification. Based on the scope of the audit and medical device, we will assign the most qualified team member. Some of our specific areas of expertise include auditing companies with manufacturing and machining, aseptic processing, agile software development, sterile products, medical device reprocessors, 3D printed manufacturing, and more. If you are interested in outsourcing any supplier or internal audit activities, you can check out our Audit Services page to get in touch or to learn more about our audit team.

Predicate selection guidance proposes controversial additions

The FDA released a new draft 510k predicate selection guidance on September 7, but the draft guidance proposes controversial additions.

[Screenshot: Draft guidance on predicate selection best practices]
Download the Draft FDA Predicate Selection Guidance

On September 7, 2023, a draft predicate selection guidance document was released by the FDA. Normally, the release of a new draft FDA guidance document is anticipated, and there is an obvious need for the draft. This new draft, however, appears to include some controversial additions that I feel should be removed from the guidance. This specific guidance was developed to help submitters use best practices in selecting a predicate. There is some useful advice regarding the need to review the FDA database for evidence of use-related and design-related safety issues associated with a potential predicate that is being considered. Unfortunately, the last section of the guidance suggests some controversial recommendations that I strongly disagree with.

Please submit comments to the FDA regarding this draft guidance

This guidance is a draft. Your comments and feedback to the FDA will have an impact on FDA policy. We are preparing a redlined draft of the guidance with specific comments and recommended changes. We will make the comments and feedback available for download from our website on our predicate selection webinar page. We are also creating a download button for the original draft in Word (.docx) format, and sharing the FDA instructions for how to respond.

Section 1 – Introduction to the guidance

The FDA indicates that this new draft predicate selection guidance document was created to provide recommendations to implement four (4) best practices when selecting a predicate device to support a 510k submission. This first objective is something that our consulting firm recommended in a training webinar. The FDA also created the guidance in an attempt to improve the predictability, consistency, and transparency of the 510k pre-market review process. This second objective is not accomplished by the draft, and the guidance needs to be modified before it is released as a final guidance.

Section 2 – Background

This section of the guidance is divided into two parts: A) The 510k Process, and B) 510k Modernization.

A. The 510k Process

The FDA released a Substantial Equivalence guidance document that explains how to demonstrate substantial equivalence. The guidance document includes a new decision tree that summarizes each of the six questions that 510k reviewers are required to answer in the process of evaluating your 510k submission for substantial equivalence. The evidence of substantial equivalence must be summarized in the Predicates and Substantial Equivalence section of the FDA eSTAR template in your 510k submission, and the guidance document reviews the content that should be provided.

Substantial equivalence is evaluated against a predicate device or multiple predicates. To be considered substantially equivalent, the subject device of your 510k submission must have the same intended use AND the same technological characteristics as the predicate device. Therefore, you cannot use two different predicates if one predicate has the same intended use (but different technological characteristics), and the second predicate has the same technological characteristics (but a different intended use). That’s called a “split predicate,” and that term is defined in the guidance. This does not prohibit you from using a secondary predicate, but you must meet the requirements of this guidance document to receive 510k clearance. The guidance document reviews five examples of multiple predicates being used correctly to demonstrate substantial equivalence.

B. 510k Modernization

The second part of this section refers to the FDA’s Safety Action Plan issued in April 2018. The Safety Action Plan was announced in conjunction with the FDA’s actions to modernize the 510k process. The goals of the FDA Safety Action Plan consist of:

  1. Establish a robust medical device patient safety net in the United States
  2. Explore regulatory options to streamline and modernize timely implementation of postmarket mitigations
  3. Spur innovation towards safer medical devices
  4. Advance medical device cybersecurity
  5. Integrate the Center for Devices and Radiological Health’s (CDRH’s) premarket and postmarket offices and activities to advance the use of a total product life cycle (TPLC) approach to device safety

Examples of modernization efforts include the following:

  • Conversion of the remaining Class 3 devices that were designated for the 510k clearance pathway to the PMA approval process instead
  • Use of objective performance standards when bringing new technology to the market
  • Use of more modern predicate devices (i.e., < 10 years old)

In this draft predicate selection guidance, the FDA states that feedback submitted to the docket in 2019 has persuaded the FDA to acknowledge that focusing only on modern predicate devices may not result in optimal safety and effectiveness. Therefore, the FDA is now proposing the approach of encouraging best practices in predicate selection. In addition, the draft guidance proposes increased transparency by identifying the characteristics of the predicate devices used to support a 510k submission.

The FDA did not mention an increased emphasis on risk analysis or risk management in the guidance, but the FDA is modernizing the quality system regulations (i.e., 21 CFR 820) to incorporate ISO 13485:2016 by reference. Since ISO 13485:2016 requires the application of a risk-based approach to all processes, this requirement will also impact the 510k process in multiple ways, such as design controls, supplier controls, process validation, post-market surveillance, and corrective actions.

Section 3 – Scope of the predicate selection guidance

The draft predicate selection guidance indicates that the guidance is intended to be used in conjunction with the FDA’s 510k program guidance. The guidance is also not intended to change any applicable statutory or regulatory requirements.

Section 4 – How to use the FDA’s predicate selection guidance

The FDA’s intended use of the predicate selection guidance is to provide submitters with a tool to help them during the predicate selection process. This guidance suggests a specific process for predicate selection. First, the submitter should identify all of the possible legally marketed devices that also have similar indications for use. Second, the submitter should exclude any devices with different technological characteristics if the differences raise new or different issues of risk. The remaining sub-group is referred to in the guidance as “valid predicate device(s).” The third, and final, step of the selection process is to use the four (4) best practices for predicate selection proposed in the guidance. The diagram below provides a visual depiction of the terminology introduced in this guidance.

Visual diagram of the terminology introduced in the predicate selection guidance
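
For readers who like to see the logic spelled out, here is a minimal sketch of the three-step funnel written as a Python filtering exercise. The candidate list and attribute names are hypothetical and are only meant to illustrate the logic described in the guidance; this is not an FDA tool or a substitute for the diagram above.

  # Illustrative sketch of the selection funnel described in the draft guidance.
  # The candidate devices and attribute names below are hypothetical.
  candidates = [
      {"k_number": "K200001", "similar_indications": True,
       "raises_new_issues_of_risk": False, "best_practice_concerns": 0},
      {"k_number": "K210002", "similar_indications": True,
       "raises_new_issues_of_risk": True, "best_practice_concerns": 0},
      {"k_number": "K190003", "similar_indications": False,
       "raises_new_issues_of_risk": False, "best_practice_concerns": 2},
  ]

  # Step 1: legally marketed devices with similar indications for use
  step1 = [c for c in candidates if c["similar_indications"]]

  # Step 2: exclude devices whose different technological characteristics
  # raise new or different issues of risk, leaving the "valid predicate device(s)"
  valid_predicates = [c for c in step1 if not c["raises_new_issues_of_risk"]]

  # Step 3: apply the four best practices (represented here by a simple count
  # of concerns found during the database searches described in Section 5)
  shortlist = sorted(valid_predicates, key=lambda c: c["best_practice_concerns"])

  print([c["k_number"] for c in shortlist])  # -> ['K200001']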

Section 5 – Best Practices (for predicate selection)

The FDA predicate selection guidance has four (4) best practices recommended for submitters to use when narrowing their list of valid predicate devices to a final potential predicate(s). Prior to using these best practices, you need to create a list of legally marketed devices that could be potential predicates. The following FDA Databases are the most common sources for generating a list of legally marketed devices:

  • Registration & Listing Database
    • Trade names of similar devices (i.e., proprietary name)
    • Manufacturer(s) of similar devices (i.e., owner operator name)
  • 510k Database
    • 510k number of similar devices
    • Applicant Name (i.e., owner operator name) of similar devices
    • Device Name (i.e., trade name) of similar devices
  • Device Classification Database
    • Device classification name of similar devices
    • Product Code of similar devices
    • Regulation Number of similar devices

Our team usually uses the Basil Systems Regulatory Database to perform our searches. Basil Systems uses data downloaded directly from the FDA, but the software gives us four advantages over the FDA public databases:

  1. The search engine uses a natural-language algorithm rather than a Boolean search.
  2. The database is much faster than the FDA databases.
  3. The results include analytics regarding the review timelines and a “predicate tree.”
  4. Basil Systems also has a post-market surveillance database that includes all of the FDA adverse events and recall data, and it also includes access to data from Health Canada and the Australian TGA.
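
If you do not have access to a commercial tool, the FDA's public openFDA API can also be queried programmatically to build an initial list of cleared devices for a product code. The sketch below is a minimal example, assuming Python with the requests library; the endpoint and field names shown (device/510k.json, product_code, k_number, applicant, decision_date) reflect the openFDA 510k dataset, but you should verify them against the current openFDA documentation before relying on the results.

  import requests

  # Minimal sketch: list recent 510k clearances for a product code using openFDA.
  OPENFDA_510K = "https://api.fda.gov/device/510k.json"

  def list_clearances(product_code, limit=25):
      params = {
          "search": f"product_code:{product_code}",
          "sort": "decision_date:desc",  # newest clearances first
          "limit": limit,
      }
      response = requests.get(OPENFDA_510K, params=params, timeout=30)
      response.raise_for_status()
      results = response.json().get("results", [])
      # Keep only the fields needed to start a predicate candidate list.
      return [
          {
              "k_number": r.get("k_number"),
              "device_name": r.get("device_name"),
              "applicant": r.get("applicant"),
              "decision_date": r.get("decision_date"),
          }
          for r in results
      ]

  if __name__ == "__main__":
      # Example: clinical electronic thermometers (product code FLL)
      for device in list_clearances("FLL", limit=10):
          print(device)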

A. Predicate devices cleared using well-established methods

Some 510k submissions rely on the same test methods used by the predicate device selected for the substantial equivalence comparison, while other devices use well-established methods. The reason for relying on the predicate’s methods may be that the predicate’s 510k submission preceded the release of an FDA product-specific, special controls guidance document. In other cases, the FDA may not have recognized an international standard for the device classification. You can search for recognized international standards associated with a specific device classification by using the FDA’s recognized consensus standards database. An example is provided below.

How to search for FLL recognized standards

FLL recognized standards

New 510k submissions should always use the methods identified in FDA guidance documents and refer to recognized international standards instead of copying the methods used to support older 510k submissions that predate the current FDA guidance or recognized standards. The problem with the FDA’s proposed approach is that the FDA is implying that a device that was not tested to the current FDA guidance or recognized standards is inherently not as safe or effective as another device that was tested to the current FDA guidance or recognized standards. This inference may not be true. Therefore, even though this may be a consideration, it is not appropriate to require manufacturers to include this as a predicate selection criterion. The FDA is already taking this into account by requiring companies to comply with the current FDA guidance and recognized standards for device description, labeling, non-clinical performance testing, and other performance testing. An example of how the FDA PreSTAR automatically notifies you of the appropriate FDA special controls guidance for a product classification is provided below.

Screen capture of the PreSTAR classification section

B. Predicate devices meet or exceed expected safety and performance

This best practice identified in the FDA predicate selection guidance recommends that you search through three different FDA databases to identify any reported injuries, deaths, or malfunctions of the predicate device. Those three databases are:

  1. MAUDE Database
  2. MDR Database
  3. MedSun Database

All of these databases are helpful, but there are also problems associated with each database. In general, adverse events are underreported, and a more thorough post-market surveillance review is needed to accurately assess the safety and performance of any device. The MAUDE data represents reports of adverse events involving medical devices, and it is updated weekly. The data consists of all voluntary reports since June 1993, user facility reports since 1991, distributor reports since 1993, and manufacturer reports since August 1996. The MDR data is no longer updated, but the MDR database allows you to search the CDRH database information on medical devices that may have malfunctioned or caused a death or serious injury during the years 1992 through 1996. The Medical Product Safety Network (MedSun) is an adverse event reporting program launched in 2002 by CDRH. The primary goal of MedSun is to work collaboratively with the clinical community to identify, understand, and solve problems with the use of medical devices. The FDA predicate selection guidance, however, does not mention the Total Product Life Cycle (TPLC) database, which is a more efficient way to search all of the FDA databases–including the recall database and the 510k database.

The biggest problem with this best practice as a basis for selecting a predicate is that the number of adverse events depends upon the number of devices used each year. For a small manufacturer, the number of adverse events will be very small because there are very few devices in use. For a larger manufacturer, the number of adverse events will be larger–even though it may represent less than 0.1% of sales. Finally, not all companies report adverse events when they are required to, while some companies may over-report adverse events. None of these possibilities is taken into consideration in the FDA’s draft predicate selection guidance.
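
To illustrate why raw adverse event counts can mislead, the sketch below pulls the total number of MAUDE reports for two example product codes from openFDA and normalizes the counts by assumed annual unit sales. The sales figures are invented for illustration because the FDA databases do not contain sales data, and the endpoint and field names are assumptions based on the openFDA device adverse event dataset that should be verified against the current documentation.

  import requests

  EVENT_ENDPOINT = "https://api.fda.gov/device/event.json"

  def total_reports(product_code):
      # meta.results.total is the number of MDRs matching the search; a real
      # analysis would also restrict the date range rather than count all years.
      params = {
          "search": f"device.device_report_product_code:{product_code}",
          "limit": 1,
      }
      r = requests.get(EVENT_ENDPOINT, params=params, timeout=30)
      r.raise_for_status()
      return r.json()["meta"]["results"]["total"]

  # Hypothetical annual unit sales; these numbers are invented for illustration.
  annual_units = {"FLL": 2_000_000, "LCX": 50_000}

  for code, units in annual_units.items():
      reports = total_reports(code)
      print(f"{code}: {reports} total reports, {units:,} units/year, "
            f"{reports / units:.4f} reports per unit sold")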

C. Predicate devices without unmitigated use-related or design-related safety issues

For the third best practice, the FDA predicate selection guidance recommends that submitters search the Medical Device Safety database and the CBER Safety & Availability (Biologics) database to identify any “emerging signals” that may indicate a new causal association between a device and an adverse event(s). As with all of the FDA database searches, this information is useful as an input to the design process because it helps to identify known hazards associated with similar devices. However, a more thorough post-market surveillance review is needed to accurately assess the safety and performance of any device–including searching databases from other countries where similar devices are marketed.

D. Predicate devices without an associated design-related recall

For the fourth best practice, the FDA predicate selection guidance recommends that submitters search the FDA recalls database. As stated above, the TPLC database includes this information for each product classification. Of the four best practices recommended by the FDA, this is the most important one, because any predicate device that was subject to a design-related recall is unlikely to be accepted by the FDA as a suitable predicate device. Therefore, this search should be conducted during the design planning phase or while design inputs are being identified. If you are unable to identify another predicate device that was not the subject of a design-related recall, then you should request a pre-submission meeting with the FDA and provide a justification for the use of the predicate device that was recalled. Your justification will need to include an explanation of the risk controls that were implemented to prevent a similar malfunction or use error with your device. Often recalls result from quality problems associated with a supplier that did not make a product to specifications or some other non-conformity associated with the assembly, test, packaging, or labeling of a device. None of these problems should automatically exclude the use of a predicate because they are not specific to the design.
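
If you want to screen recalls for design-related root causes before ruling a predicate in or out, the openFDA device recall dataset can be queried in the same way. The sketch below is a minimal example, assuming Python with the requests library; the field name root_cause_description is an assumption about how the FDA-assigned root cause is exposed in that dataset, so confirm it against the current openFDA documentation.

  import requests

  RECALL_ENDPOINT = "https://api.fda.gov/device/recall.json"

  def recalls_for_product_code(product_code, limit=25):
      params = {"search": f"product_code:{product_code}", "limit": limit}
      r = requests.get(RECALL_ENDPOINT, params=params, timeout=30)
      if r.status_code == 404:
          return []  # openFDA returns 404 when a search has no matches
      r.raise_for_status()
      return r.json().get("results", [])

  # Separate design-related recalls from quality- or supplier-related recalls.
  # "root_cause_description" is assumed to hold the FDA-assigned root cause
  # (e.g., "Device Design"); the field name and values should be confirmed.
  for recall in recalls_for_product_code("FLL"):
      root_cause = recall.get("root_cause_description", "Unknown")
      design_related = "design" in root_cause.lower()
      print(f'{recall.get("recalling_firm")}: {root_cause} '
            f'(design-related: {design_related})')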

Section 6 – Improving Transparency

This section of the FDA predicate selection guidance contains the most controversial recommendations. The FDA is proposing that the 510k summary in 510k submissions should include a narrative explaining the submitter’s selection of the predicate device(s) used to support the 510k clearance. This would be a new requirement for the completion of a 510k summary because that information is not currently included in 510k summaries. The new FDA eSTAR has the ability to automatically generate a 510k summary as part of the submission (see example below), but the 510k summary generated by the eSTAR does not include a section for a narrative explaining the reasons for predicate selection.

Sections of the 510k summary that are automatically populated in the eSTAR

The FDA added this section to the draft guidance with the goals of improving the predictability, consistency, and transparency of the 510k pre-market review process. However, the proposed addition of a narrative explaining the reasons for predicate selection is not the best way to achieve those goals. Transparency is best achieved by eliminating the option of a 510k statement (i.e., 21 CFR 807.93). Currently, the 510k process allows for submitters to provide a 510k statement or a 510k summary. The 510k statement prevents the public from gaining access to any of the information that would be provided in a 510k summary. Therefore, if the narrative explaining the reasons for predicate selection is going to be required in a 510k submission, that new requirement should be added to the substantial equivalence section of the eSTAR instead of only including it in the 510k summary. If the 510k statement is eliminated as an option for submitters, then all submitters will be required to provide a 510k summary and the explanation for the predicate selection can be copied from a text box in the substantial equivalence section.

The FDA eSTAR ensures consistency of the 510k submission contents and format, and tracking of FDA performance has improved the consistency of the FDA 510k review process. Adding an explanation for predicate selection will not impact either of these goals for improving the 510k process. In addition, companies do not select predicates only for the reasons indicated in this FDA predicate selection guidance. One of the most common reasons for selecting a predicate is the cost of purchasing samples of predicate devices for side-by-side performance testing. This only relates to cost, not safety or performance, and forcing companies to purchase more expensive devices for testing would not align with the least burdensome approach. Another flaw in this proposed additional information to be included in the 510k summary is that there is a huge variation in the number of predicates that can be selected for different product classifications. For example, 319 devices were cleared in the past 10 years for the FLL product classification (i.e., clinical electronic thermometer), while 35 devices were cleared in the past 10 years for the LCX product classification (i.e., pregnancy test). Therefore, the approach to selecting a predicate for these two product classifications would be significantly different due to the number of valid predicates to choose from. This makes it very difficult to create a predictable or consistent process for predicate selection across all product classifications. There may also be confidential, strategic reasons for predicate selection that would not be appropriate for a 510k summary.

Section 7 – Examples

The FDA predicate selection guidance provides three examples. In each example, the FDA suggests that the submitter should provide a table that lists the valid predicate devices and compares those devices using the four best practices as criteria for the final selection. The FDA is positioning this as providing more transparency to the public, but the information, presented in this way, would not be useful to the public. This approach creates more documentation for companies to submit to the FDA without making devices safer or improving efficacy. It would also change the required content of a 510k summary and introduce post-market data as criteria for 510k clearance. This is a significant deviation from the current FDA policy.

Example 1 from predicate selection guidance

In this example, the submitter included a table in their 510k submission, along with their rationale for selecting one of the four potential predicates as the predicate device used to support their 510k submission. This example is the most concerning because the summary doesn’t have any details regarding the volume of sales for the potential predicates being evaluated. The number of adverse events and recalls is usually correlated with the volume of sales. The proposed table doesn’t account for this information.

Example 2 from predicate selection guidance

In this example, the submitter was only able to identify one potential, valid predicate device. The submitter provided a table showing that the predicate did not present concerns for three of the four best practices, but the predicate was the subject of a design-related recall. The submitter also explained the measures taken to reduce the risk of those safety concerns in the subject device. As stated above, using the occurrence of a recall as the basis for excluding a predicate is not necessarily appropriate. Most recalls are initiated due to reasons other than the design. Therefore, you need to make sure that the reason for the recall is design-related rather than a quality system compliance issue or a vendor quality issue.

Example 3 from predicate selection guidance

In this example, the submitter identified two potential, valid predicate devices. No safety concerns were identified using any of the four best practices, but the two potential predicates have different market histories. One device has 15 years of history, and the second device has three years of history. The submitter chose the device with 15 years of history because it had a longer regulatory history. The problem with this approach is that the number of years since clearance is not an indication of how much the device has actually been used. A device can be cleared in 2008, but it might not be launched commercially until several years later. In addition, the number of devices used may be quite small for a small company. In contrast, if the product with three years since 510k clearance is distributed by a major medical device company, there may be thousands of devices in use every year.

Medical Device Academy’s recommendations for predicate selection

The following information consists of recommendations our consulting firm provides to clients regarding predicate selection.

Try to use only one predicate (i.e., a primary predicate)

Once you have narrowed down a list of predicates, we generally recommend only using one of the options as a primary predicate and avoiding the use of a second predicate unless absolutely necessary. If you are unsure of whether a second predicate or reference device is needed, this is an excellent question to ask the FDA during a pre-submission teleconference under the topic of “regulatory strategy” (see image below). In your PreSTAR you can ask the following question, “[Your company name] is proposing to use [primary predicate] as a primary predicate. A) Does the FDA have any concerns with the predicate selection? B) Does the FDA feel that a secondary predicate or reference device is needed?”

PreSTAR topic selection

When and how to use multiple predicates

Recently, a client questioned me about the use of a secondary predicate in a 510k submission that I was preparing. They were under the impression that only one predicate was allowed for a 510k submission because the FDA considers two predicate devices to be a “split predicate.” The video provided above explains the definition of a “split predicate,” and the definition refers to more than the use of two predicates. Many of the 510k submissions we prepared and obtained clearance for used secondary predicates. An even more common strategy is to use a second device as a reference device. The second device may only have technological characteristics in common with the subject device, but the methods of safety and performance testing used can be adopted as objective performance standards for your 510k submission.

When you are trying to use multiple predicate devices to demonstrate substantial equivalence to your subject device in a 510k submission, you have three options for the correct use of multiple predicate devices:

  1. Two predicates with different technological characteristics, but the same intended use.
  2. A device with more than one intended use.
  3. A device with more than one indication under the same intended use.

If you use “option 1”, then your subject device must have the technological characteristics of both predicate devices. For example, your device has Bluetooth capability, and it uses infrared technology to measure temperature, while one of the two predicates has Bluetooth but uses a thermistor, and the other predicate uses infrared measurement but does not have Bluetooth.

If you use “option 2”, you are combining the features of two different devices into one device. For example, one predicate device is used to measure temperature, and the other predicate device is used to measure blood pressure. Your device, however, can perform both functions. You might have chosen another multi-parameter monitor on the market as your predicate; however, you may not be able to do that if none of the multi-parameter monitors have the same combination of intended uses and technological characteristics. This scenario is quite common when a new technology is introduced for monitoring, and none of the multi-parameter monitors are using the new technology yet.

If you use “option 3”, you need to be careful that the ability of your subject device to be used for a second indication does not compromise the performance of the device for the first indication. For example, bone fixation plates are designed for the fixation of bone fractures. If the first indication is for long bones, and the second indication is for small bones in the wrist, the size and strength of the bone fixation plate may not be adequate for long bones, or the device may be too large for the wrist.


Design Controls Implementation

Design controls can be overwhelming, but you can learn the process using this step-by-step guide to implementing design controls.
Design and development process diagram

Step 1: Create a design controls procedure

You can implement design controls at any point during the development process, but the earlier you implement your design process, the more useful design controls will be. The first step of implementing design controls is to create a design controls procedure. You will also need at least two of the following additional quality system procedures:

  1. Risk Management Procedure (SYS-010)
  2. Software Development and Validation (SYS-044)
  3. Usability Procedure (SYS-048)
  4. Cybersecurity Work Instruction (WI-007)

A risk management file (in accordance with ISO 14971:2019) and usability engineering or human factors engineering documentation (in accordance with IEC 62366-1) are required for all medical devices. The software and cybersecurity procedures listed above are only required for products with 1) software and/or firmware, and 2) wireless functionality or an access point for removable media (e.g., a USB flash drive or SD card).

Step 2: Design controls training

Even though the requirement for design controls has been in place for more than 25 years, there are still far too many design teams that struggle with understanding these requirements. Medical device regulations are complex, but design controls are the most complex process in any quality system. The reason for this is that each of the seven sub-clauses represents a mini-process that is equivalent in complexity to CAPA root cause analysis. Many companies choose to create separate work instructions for each sub-clause.

Medical Device Academy’s training philosophy is to distill processes down to discrete steps that can be absorbed and implemented quickly. We use independent forms to support each step and develop training courses with practical examples, instead of writing a detailed procedure(s). The approach we teach removes complexity from your design control procedure (SYS-008). Instead, we rely upon the structure of step-by-step forms completed at each stage of the design process.

If you are interested in design control training, Rob Packard will be hosting the 3rd edition of our Design Controls Training Webinar on Friday, August 11, 2023, @ 9:30 am EDT.

Step 3: Gathering post-market surveillance data

Post-market surveillance is not currently required by the FDA in 21 CFR 820, but it is required by ISO 13485:2016 in Clause 7.3.3c) (i.e., “[Design and development inputs] shall include…applicable output(s) of risk management”). The FDA is expected to release the plans for the transition to ISO 13485 in FY 2024, but most companies mistakenly think that the FDA does not require consideration of post-market surveillance when they are designing new devices. That is not correct. There are three ways the FDA expects post-market surveillance to be considered when you are developing a new device:

  1. Complaints and adverse events associated with previous versions of the device and competitor devices should be identified as input to the risk management process for hazard identification.
  2. If the device incorporates software, existing vulnerabilities of the off-the-shelf software (including operating systems) should be identified as part of the cybersecurity risk assessment process.
  3. During the human factors process, you should search for known use errors associated with previous versions of the device and competitor devices; known use-related risks should also include any potential use errors identified during formative testing.

Even though the FDA does not currently require compliance with ISO 13485, the FDA does recognize ISO 14971:2019, and post-market surveillance is identified as an input to the risk management process in Clause 4.2 (see note 2), Clause 10.4, and Annex A.2.10. 

Step 4: Creating a design plan 

You are required to update your design plan as the development project progresses. Most design and development projects take a year before the company is ready to submit a 510k submission to the FDA. Therefore, don’t worry about making your first version of the plan perfect. You have a year to make lots of improvements to your design plan. At a minimum, you should be updating your design plan during each design review. One thing that is important to capture in your first version, however, is the correct regulatory pathway for your intended markets. If you aren’t sure which markets you plan to launch in, you can select one market and add more later, or you can select a few and delete one or more later. Your design plan should identify the resources needed for the development project, and you should estimate when you expect to conduct each of your design reviews.

Contents of your design plan

The requirement for design plans is stated in both Clause 7.3.1 of the ISO standards and 21 CFR 820.30(b) of the FDA QSR. You can make your plan as detailed as you need to, but I recommend starting simple and adding detail. Your first version of a design plan should include the following tasks:

  • Identification of the regulatory pathway based on the device risk classification and applicable harmonized standards.
  • Development of a risk management plan
  • Approval of your design plan (1st design review) 
  • Initial hazard identification
  • Documentation and approval of user needs and design inputs (2nd design review) 
  • Risk control option analysis
  • Iterative development of the product design
  • Risk analysis 
  • Documentation and approval of design outputs and implementation of risk control measures (3rd design review)
  • Design verification and verification of the effectiveness of risk control measures (4th design review)
  • Design validation and verification of the effectiveness of risk control measures that could not be verified with verification testing alone
  • Clinical evaluation and benefit/risk analysis (5th design review)
  • Development of a post-market surveillance plan with a post-market risk management plan
  • Development of a draft Device Master Record/Technical File (DMR/TF) Index
  • Regulatory approval (e.g., 510k clearance) and closure of the Design History File (DHF)
  • Commercial release (6th and final design review)
  • Review lessons learned and initiate actions to improve the design process

Step 5: Create a detailed testing plan

Your testing plan must indicate which recognized standards you plan to conform with, and any requirements that are not applicable should be identified and documented with a justification for the non-applicability. The initial version of your testing plan will be an early version of your user needs and design inputs. However, you should expect the design inputs to change several times. One time you may need to change design inputs is after you receive feedback from regulators. You may also need to make changes when you fail your testing (i.e., preliminary testing, verification testing, or validation testing). If your company is following “The Lean Startup” methodology, your initial version of the design inputs will be for a minimum viable product (i.e., MVP). As you progress through your iterative development process, you will add and delete design inputs based on customer feedback and preliminary testing. Your goal should be to fail early and fail fast because you don’t want to get to your verification testing and fail. That’s why we conduct a “design freeze” prior to starting the design verification testing and design transfer activities.

Design timeline with a 513(g) request
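
One simple way to keep the testing plan aligned with the design inputs is to track it as structured data so that every requirement is either mapped to a verification protocol or justified as not applicable. The sketch below is a hypothetical example; the design inputs, protocol numbers, and justification text are invented, and only the standard numbers are real.

  # Hypothetical testing plan entries mapping design inputs to standards and protocols.
  testing_plan = [
      {"design_input": "Electrical safety", "standard": "IEC 60601-1",
       "applicable": True, "protocol": "VER-001", "justification": None},
      {"design_input": "Biocompatibility of patient-contacting materials",
       "standard": "ISO 10993-1", "applicable": True, "protocol": "VER-002",
       "justification": None},
      {"design_input": "Home healthcare environment requirements",
       "standard": "IEC 60601-1-11", "applicable": False, "protocol": None,
       "justification": "Device is intended for professional use only."},
  ]

  # Every requirement is either covered by a protocol or justified as not applicable.
  for row in testing_plan:
      assert row["applicable"] == (row["protocol"] is not None)
      assert row["applicable"] or row["justification"]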

Step 6: Request a pre-submission meeting with the FDA

Design inputs need to be requirements that can be verified through the use of a verification protocol. If you identify external standards for each design input, you will have an easier time completing the verification activities because verification tests will be easier to identify. Some standards do not include testing requirements, and there are requirements that do not correspond to an external standard. For example, IEC 62366-1 is an international standard for usability engineering, but the standard does not include specific testing requirements. Therefore, manufacturers have to develop their own test protocol for validation of the usability engineering controls implemented. If your company is developing a novel sterilization process (e.g., UV sterilization), you will also need to develop your own verification testing protocols. In these cases, you should submit the draft protocols to the FDA (along with the associated risk analysis documentation) to obtain feedback and agreement with your testing plan. The method for obtaining written feedback and agreement with a proposed testing plan is to submit a pre-submission meeting request to the FDA (i.e., a PreSTAR).

Step 7: Iterative development is how design controls really work

Design controls became a legal requirement in the USA in 1996 when the FDA updated the quality system regulations. At that time, the “V-diagram” was quite new and limited to software development. Therefore, the FDA requested permission from Health Canada to reprint the “Waterfall Diagram” in the design control guidance that the FDA released. Both diagrams are models. They do not represent best practices, and they do not claim to represent how the design process is done in most companies. The primary information communicated by the “Waterfall Diagram” is that user needs are validated while design inputs are verified. The diagram is not intended to communicate that the design process is linear or must proceed from user needs, to design inputs, and then to design outputs. The “V-Diagram” is meant to communicate that there are multiple levels of verification and validation testing, and that the development process is iterative as software bugs are identified. Both models help teach design and development concepts, but neither is meant to imply legal requirements. One of the best lessons to teach design and development teams is that there is a need to develop simple tests to screen design concepts so that design concepts can fail early and fail fast–before the design is frozen. This process is called “risk control option analysis,” and it is required in clause 7.1 of ISO 14971:2019.

Step 8: “Design Freeze”

Design outputs are drawings and specifications. Ensure you keep them updated and control the changes. When you finally approve the design, this is the approval of your design outputs (i.e., selection of risk control options). The final selection of design outputs or risk control measures is often conducted as a formal design review meeting. The reason for this is that the cost of design verification is significant. There is no regulatory or legal requirement for a “design freeze.” In fact, there are many examples where changes are anticipated, but the team decides to proceed with the verification testing anyway. The best practice developed by the medical device industry is to conduct a “design freeze.” The design outputs are “frozen,” and no further changes are permitted. The act of freezing the design is simply intended to reduce the business risk of spending money on verification testing twice because the design outputs were changed during the testing process. If a device fails testing, it will be necessary to change the design and repeat the testing, but if every person on the design team agrees that the need for changes is remote and that the company should begin testing, it is less likely that changes will be made after the testing begins.

Step 9: Begin the design transfer process

Design transfer is not a single event in time. Transfer begins with the release of your first drawing or specification to purchasing and ends with the commercial release of the product. The most common example of a design transfer activity is the approval of prototype drawings as a final released drawing. This is common for molded parts. Several iterations of the plastic part might be evaluated using 3D printed parts and machined parts, but in order to consistently make the component for the target cost an injection mold is typically needed. The cost of the mold may be $40-100K, but it is difficult to change the design once the mold is built. The lead time for injection molds is often 10-14 weeks. Therefore, a design team may begin the design transfer process for molded parts prior to conducting a design freeze. Another component that may be released earlier as a final design is a printed circuit board (PCB). Electronic components such as resistors, capacitors, and integrated circuits (ICs) may be available off-the-shelf, but the raw PCB has a longer lead time and is customized for your device.

Step 10: Verification of Design Controls

Design verification testing requires pre-approved protocols and pre-defined acceptance criteria. Whenever possible, design verification protocols should be standardized instead of being project-specific. Information regarding traceability to the calibrated equipment identification and test methods should be included as a variable that is entered manually into a blank space when the protocol is executed. The philosophy behind this approach is to create a protocol once and repeat it forever. This results in a verification process that is consistent and predictable, and it also eliminates the need for review and approval of the protocol for each new project. Standardized protocols do not need to specify a vendor or dates for the testing, but you might consider documenting the vendor(s) and duration of the testing in your design inputs to help with project management and planning. You might also want to use a standardized template for the format and content of your protocol and report. The FDA provides a guidance document specifically for the report format and content for non-clinical performance testing.

Step 11: Validation of Design Controls

Design validation is required to demonstrate that the device meets the user’s and patient’s needs. User needs are typically the indications for use–including safety and performance requirements. Design validation should be more than bench testing. Ensure that animal models, simulated anatomical models, finite element analysis, and human clinical studies are considered. One purpose of design validation is to demonstrate performance for the indications for use, but validating that risk controls implemented are effective at preventing use-related risks is also important. Therefore, human factors summative validation testing is one type of design validation. Human factors testing will typically involve simulated use with the final version of the device and intended users. Validation testing usually requires side-by-side non-clinical performance testing with a predicate device for a 510k submission, while CE Marking submissions typically require human clinical data to demonstrate safety and performance.

Step 12: FDA 510k Submission

FDA pre-market notification, or a 510k submission, is the most common type of regulatory approval required for medical devices in the USA. An FDA submission can usually be made earlier than submissions in other countries because the FDA does not require quality system certification or summary technical documents, and the performance testing data is usually non-clinical benchtop testing. FDA 510k submissions also do not require the submission of process validation for manufacturing. Therefore, most verification and validation is conducted on “production equivalents” that were made in small volumes before the commercial manufacturing process is validated. The quality system and manufacturing process validation may be completed during the FDA 510k review.

Step 13: The Final Design Review 

Design reviews should have defined deliverables. We recommend designing a form for documenting the design review, which identifies the deliverables for each design review. The form should also define the minimum required attendees by function. Other design review attendees should be identified as optional—rather than required reviewers and approvers. If your design review process requires too many people, this will have a long-term impact upon review and approval of design changes.

The only required design review is a final design review to approve the commercial release of your product. Do not keep the DHF open after commercial release. All changes after that point should be under production controls, and changes should be documented in the Device Master Record (DMR)/Technical File (TF). If device modifications require a new 510k submission, then you should create a new design project and DHF for the device modification. The new DHF might have no changes to the user needs and design inputs, or you might have minor changes (e.g., a change in the sterilization method requires testing to revised design inputs).

Step 14: FDA Registration

Within 30 days of initial product distribution in the USA, you are required to register your establishment with the FDA. Registration must be renewed annually between October 1 and December 31, and registration is required for each facility. If your company is located outside the USA, you will need an initial importer that is registered and you will need to register before you can ship the product to the USA. Non-US companies must also designate a US Agent that resides in the USA. At the time of FDA registration, your company is expected to be compliant with all regulations for the quality system, UDI, medical device reporting, and corrections/removals.

Step 15: Post-market surveillance is the design control input for the next design project

One of the required outputs of your final design review is your DMR Index. The DMR Index should perform a dual function of also meeting technical documentation requirements for other countries, such as Canada and Europe. A Technical File Index, however, includes additional documents that are not required in the USA. One of those documents is your post-market surveillance plan and the results of post-market surveillance. That post-market surveillance is an input to your design process for the next generation of products. Any use errors, software bugs, or suggestions for new functionality should be documented as post-market surveillance and considered as potential inputs to the design process for future design projects.

Step 16: Monitoring your design controls process

Audit your design controls process to identify opportunities for improvement and preventive actions. Audits should include a review of the design process metrics, and you may consider establishing quality objectives for the improvement of the design process. This last step, and the standardization of design verification protocols in step ten (10), are discussed in further detail in another blog by Medical Device Academy.


What is MDUFA V?

MDUFA V is the agreement between the FDA and the medical device industry to fund the review of medical device submissions by the FDA.

What is MDUFA V?

The Medical Device User Fee and Modernization Act (MDUFMA or MDUFA) is a set of agreements between the Food and Drug Administration (FDA) and the medical device industry to provide funds for the Office of Device Evaluation (ODE) to review medical device submissions. FDA user fees were first authorized via MDUFMA in 2002 for FY 2003. Each MDUFA reauthorization has lasted five years, and FY 2023 will be the 21st year.

How are the MDUFA V user fees decided?

Section 738A(b)(1) of the FD&C Act requires that the FDA consult with various stakeholders, including representatives from patient and consumer advocacy groups, healthcare professionals, and scientific and academic experts, to develop recommendations for the next MDUFA five-year cycle. The FDA initiated the reauthorization process by holding a public meeting on October 27, 2020, where stakeholders and other public members could present their views on the reauthorization. Four industry groups represented the regulated industry in the MDUFA V negotiations with the FDA.

The FD&C Act further requires that the FDA continue meeting with the representatives of patient and consumer advocacy groups at least once every month during negotiations with the regulated industry to continue discussing stakeholder views on the reauthorization and their suggestions for changes.

What are FDA user fees?

At the very core of it, the FDA user fees fund the FDA Office of Device Evaluation (ODE) budget. Without these user fees, the FDA cannot begin reviewing a medical device submission. This includes 510k, PMA, and De Novo submissions. Before the FDA assigns a reviewer to your submission, you must pay the appropriate device user fee in full unless you are eligible for a waiver or exemption. If you pay the user fee by credit card, you must allow a few extra days for the user fee to clear. Otherwise, your submission will be placed on “User Fee Hold.” Small businesses may qualify for a reduced fee. The FDA announced the FY 2024 FDA user fees on July 28, 2023. The FDA will announce the user fees for FY 2025 in a Federal Register notice in August 2024.

When does MDUFA V take effect?

Our team regularly checked for the announcement of the MDUFA V user fees from August 2022 until the October 5, 2022 announcement. The announcement of the FY 2023 user fees was delayed because Congress did not approve the MDUFA reauthorization until the last week of September. The new user fees were initially expected to take effect on October 1, 2022, but the actual user fees were not announced until October 5, 2022, approximately two months later than expected.

Why was MDUFA V delayed, and will it happen again?

MDUFA V was delayed because the user fee reauthorization requires an act of Congress. The House of Representatives approved the Food and Drug Amendments of 2022 on June 8, 2022. However, the Senate did not file a bill until after the August recess. There were also differences between the legislation proposed by the House and the Senate. Therefore, to ensure that the FDA did not have to furlough employees when MDUFA IV funding expired, the President approved and signed a temporary reauthorization on September 30, 2022. The short-term continuing resolution is a temporary stopgap to fund the FDA until December 16, 2022. However, the continuing resolution covers funding for medical device user fees through September 30, 2027. Therefore, the device industry can expect the FDA to continue to operate regardless of the outcome of temporary policies that expire this December. Still, similar delays occurred with previous MDUFA reauthorizations, and we expect more of the same US partisan politics between August 2027 and the November 2027 election.

How much did MDUFA V user fees increase?

The increase depends upon the fee type. Annual registration fees are increasing by 14.47% (i.e., from $5,672 to $6,493). The 510k user fee increased by a stupendous 55.90%, from $12,745 to $19,870. Yikes! De Novo Classification Requests increased by 17.79%, from $112,457 to $132,464. Other submissions increased by similar amounts. For more details, check out the table below (also posted on our homepage).

FDA User Fee FY 2023 represents a 55.90% increase in the 510(k) user fee

FY 2024 User Fees

Do user fees ever decrease?

If we lived in a magical world where gas prices dropped and stayed low, the inflation-adjusted pricing would decrease for FDA user fees. That has happened once, but I fit into skinny jeans once too. The increase in FDA user fees from FY 2023 to FY 2024 was 9.5%, except the Annual Registration Fee, which increased by 17.87% to $7,653.

Why is August 1st important?

August 1st is the first day the FDA accepts Small Business Certification Requests for the new fiscal year. That means any small business that wants to keep small business status needs to reapply, and any new business that qualifies for small business status must also apply. Applying for small business status is important because of how much you could save on your submission. The FDA will complete its review of the Small Business Certification Request within 60 calendar days of receipt. Upon completion of the review, the FDA will send you a decision letter with your small business designation number or a justification for denial.

Does small business status expire?

Yes, small business status expires. The small business status expires on September 30 of the fiscal year it is granted. A new MDUFA Small Business Certification Request must be submitted and approved each fiscal year to qualify as a small business. If you forget to reapply for small business status on August 1, you can reapply anytime during the year. Still, you will temporarily lose small business status from October 1 until the qualification is renewed. The good news is there is no fee associated with submitting a Small Business Certification Request. For more information, please visit our webpage dedicated to small business qualifications.


Auditor shadowing as an effective auditor training technique

This article reviews auditor shadowing as an effective auditor training technique, but we also identify five common auditor shadowing mistakes.

How do you evaluate auditor competency?

Somewhere in your procedure for quality audits, I’ll bet there is a section on auditor competency. Most companies require that the auditor has completed either an internal auditor course or a lead auditor course. If the course had an exam, you might even have evidence of training effectiveness. Demonstrating competency, however, is much more challenging. One way is to review internal audit reports, but writing reports is only part of what an auditor does. How can you evaluate an auditor’s ability to interview people, take notes, follow audit trails, and manage their time? The most common solution is to require that the auditor “shadow” a more experienced auditor several times, and then the trainer will “shadow” the trainee.

If you are shadowing, you are taking notes, so you can discuss your observations with the person you are shadowing later. 

Auditor shadowing in 1st party audits

ISO 19011:2018 defines first-party audits as internal audits. When first-party auditors are being shadowed by a trainer or vice versa, there are many opportunities for training. The key to the successful training of auditors is to recognize teachable moments.

When the trainer is auditing, the trainer should look for opportunities to ask the trainee, “What should I do now?” or “What information do I need to record?” In these situations, the trainer asks the trainee what they should do BEFORE doing it. If the trainee is unsure, the trainer should immediately explain what, why, and how with real examples.

When the trainer is shadowing, the trainer should watch and wait for a missed opportunity to gather important information. In these situations, the trainer must resist guiding the trainee until after the trainee appears to be done. When it happens, sometimes the best tool is simply asking, “Are you sure you got all the information you came for?”

Here are five (5) mistakes that I have observed trainers make when they were shadowing:

1. Splitting up, instead of staying together, is one of the more common mistakes I have observed. This happens when people are more interested in completing an audit than in taking advantage of training opportunities. The trainee may be capable of auditing independently, but this is unfair to the trainee because they need feedback on their auditing technique. This is also unfair to the auditee because it is challenging to support multiple auditors simultaneously. When splitting up is unplanned, a trainer may not be available for both auditors. If an audit is running behind schedule, this is the perfect time to teach a trainee how to recover some time in their schedule. Time management is, after all, one of the most challenging skills for auditors to master.

2. Staying in the conference room instead of going to where the work is done is a common criticism of auditors. If the information you need to audit can be found in a conference room, you could have completed the audit remotely. This type of audit only teaches new auditors how to take notes, and note-taking is a skill that auditors should master in a classroom before shadowing.

3. Choosing an administrative process is a mistake because administrative processes limit the number of aspects of the process approach that an auditor-in-training can practice. Administrative processes rarely have equipment that requires validation or calibration, and the process inputs and outputs consist only of paperwork, forms, or computer records. When a process has raw materials and finished goods, the auditor’s job is more challenging because there is more to be aware of.

4. Not providing honest feedback is a huge mistake. Auditors need to be thick-skinned, or they don’t belong in a role where they will criticize others. Before you begin telling others how to improve, you must self-reflect and identify your strengths and weaknesses. Understanding your perspective, strengths, weaknesses, and prejudices is critical to being a practical assessor. As a trainer, it is your job to help new auditors to self-reflect and accurately rate their performance against objective standards.

5. “Silent Shadowing” has no value at all. By this, I mean shadowing another auditor without asking questions. If you are a trainee, you should mentally pretend you are doing the audit. Whenever the trainer does something different from how you would do things, you should make a note to ask, “Why did you do that?” If you are the trainer, you should also mentally pretend you are doing the audit. It is not enough to be present. Your job is to identify opportunities for the trainee to improve. The better the trainee, the more challenging it becomes to identify areas for improvement. This is why training other auditors has helped me improve my own auditing skills.

Auditor shadowing in 2nd party audits


Auditors responsible for supplier auditing are critical to supplier selection, supplier evaluation, re-evaluation, and the investigation of the root cause of any non-conformities related to a supplier. Auditor shadowing is a great tool to teach supplier auditors and other people responsible for supply-chain management what to look at and what to look for when they audit a supplier. If you are developing a new supplier quality engineer who will be responsible for performing supplier audits, having them shadow an experienced auditor during some actual supplier audits is recommended. Supplier audits are defined as second-party audits in the ISO 19011 Standard. The purpose of these audits is not to verify conformity to all the aspects of ISO 13485. Instead, the primary purpose of these audits is to verify that the supplier has adequate controls to consistently manufacture conforming products for your company. Therefore, processes such as Management Review (Clause 5.6) and Internal Auditing (Clause 8.2.2) are not typically sampled during a second-party audit.

The two most valuable processes for a second-party auditor to sample are 1) incoming inspection and 2) production controls. Using the process approach to auditing, the second-party auditor will have an opportunity to verify that the supplier has adequate controls for documents and records for both of these processes. Training records for personnel performing these activities can be sampled. The adequacy of raw material storage can be evaluated by following the flow of accepted raw materials leaving the incoming inspection area. Calibration records can be sampled by gathering equipment numbers from calibrated equipment used by both processes. Even process validation procedures can be assessed by comparing the actual process parameters being used in manufacturing with the documented process parameters in the most recent validation or re-validation reports.

I recommend having the trainee shadow the trainer during the process audit of incoming inspection and having the trainer shadow the trainee during the process audit of production. The trainee should ask questions between the two process audits to make sure they fully understand the process approach to auditing. Supplier auditors should also be coached on techniques for overcoming resistance to observing processes that involve trade secrets or where competitor products may be present. During the audit of production processes, the trainer may periodically prompt the trainee to gather the information needed to follow audit trails to calibration records and document control, or to compare actual settings with the validated process parameters. The “teachable moment” is immediately after the trainee misses an opportunity, but while the trainee is still close enough to go back and capture the missing details.

Are you allowed to shadow a 3rd party auditor or FDA inspector?


Consider using 3rd party audits and inspections as an opportunity to shadow experienced auditors and learn what they look at and what they look for. In addition to shadowing an expert within your own company or an auditor/consultant you hire for an internal audit, you can also shadow a 3rd party auditor or an FDA inspector. This concept was the subject of a discussion thread I ran across on Elsmar Cove from 2005, and the comments in that thread supported the idea of shadowing a 3rd party auditor. The process owner (i.e., the manager responsible for that process) should be the guide for whichever process is being audited, and the process owner is responsible for addressing any non-conformities found in the area. The process owner should be present during interviews but should refrain from commenting. The 3rd party auditor and the process owner both need to know whether the person being interviewed was effectively trained and whether they can explain the process under the pressure of an audit or FDA inspection. If you are interested in implementing this idea, I recommend using one of the following two strategies (or both):

  1. Consider having the internal auditor who audited each process shadow the certification body auditor for the processes they covered during the internal audit. This approach will teach your internal auditor what they might have missed, and they will learn what 3rd party auditors look for so they can simulate a 3rd party audit more effectively when conducting internal audits.
  2. Consider having the internal auditor who is assigned to conduct the next process audit of each process shadow the certification body auditor for that process. This approach will ensure that any nonconformities observed during the 3rd party audit are checked for the effectiveness of corrective actions during the next internal audit. Your internal auditor will know precisely how the original nonconformity was identified and the context of the finding.


Seven ways to improve quality auditor training

A five-day lead auditor course is never enough. Effective quality auditor training must include practical feedback from an expert.

What is required for quality auditor training?

The key to training auditors to audit is consistent follow-up over a long period of time (1-2 years, depending upon the frequency of audits). I recommend following the same training process that accredited auditors must complete. I have adapted that process and developed seven (7) specific recommendations.

Training the trainer

One of my clients asked me to create a training course on how to train operators. I could have taught the operators myself, but so many people needed training that we felt it would be more cost-effective to train the trainers. Usually, I have multiple archived presentations I can draw upon, but this time I had nothing. I had never trained engineers on how to be trainers before, at least not formally. I thought about the problems other quality managers have had in training internal auditors and how I have helped those auditors improve. The one theme I recognized was that effective quality auditor training needs to include practical feedback from an experienced auditor. An expert auditor who is training new auditors needs to identify systematic ways to provide feedback, and setting a benchmark for the number of times feedback will be provided is very helpful.

Improve by observing yourself and other quality auditors

Observing someone else is a great way to learn any new skill. Interns often do this, and it is also a technique used to train new auditors; it is called shadowing. You can learn by watching, but eventually you need to attempt tasks that are beyond your comfort level, and it is best to practice auditing with an expert watching you.

Practice team member audit preparation

Many of the internal auditing procedures we see require new auditors to conduct three audits as team members before they can audit independently. In contrast, notified body auditors join as team members for 10-20 audits before they can act as lead auditors. During the training period, auditors in training observe multiple lead auditors and multiple quality systems. Each audit allows auditors in training to write nonconformities and receive feedback from a lead auditor. At the beginning of quality auditor training, the focus must be on audit preparation: What are the areas of importance? What were the results of previous audits? Are there any previous audit findings to close? This preparation can even be done as practice for a hypothetical audit.

During quality auditor training, practice the opening and closing meetings

Opening and closing meetings are among the first things to teach a new lead auditor. Have new lead auditors rehearse their first few opening and closing meetings with you in private before conducting them with the auditee. Ensure the lead auditor has an opening/closing meeting checklist to help them. Recording practice sessions is enormously helpful because trainees can watch the recording and observe their own mistakes. As trainees gain experience, the opening and closing meetings should have time limits. Finally, you might ask members of top management to challenge the lead auditor with questions. The lead auditor needs to be comfortable defending their decisions and the grading of the audit findings.

How to practice audit team leadership

Have new lead auditors conduct team audits with another qualified lead auditor for 10-20 audits before you allow them to conduct an audit alone. Leading the opening and closing meetings is usually the first area new lead auditors master. The most difficult area to learn is managing a team of auditors. Team members will fall behind schedule during audits, or someone will forget to audit a process. As a lead auditor, you must complete the audits for your assigned processes while communicating with the entire team to ensure everyone else is on schedule. As an observer, you must let lead auditors make mistakes and then help them recognize those mistakes. Initially, a trainer will encourage new lead auditors to give themselves more than enough time. As training progresses, the timing needs to become shorter and more challenging. Ultimately, you have to push the team beyond its capability to teach new lead auditors to recognize the warning signs of problems and how to fix them.

Shadow auditors virtually with recordings

Live shadowing is challenging for experts and trainees alike because you are distracted by listening to the auditee while trying to observe the auditor. However, if an audit is recorded, the person shadowing can watch the recording. The audit is already complete, so there is little need to concentrate on the auditee, and the recording allows the observer to focus on the auditor. If a new auditor is conducting their first audit, an expert should shadow the trainee for 100% of the audit. Gradually, the amount of observation can decrease with each subsequent audit.

Practice note-taking with recorded audits

Taking detailed notes is something that experts take for granted, but I learned a lot by watching FDA inspectors take notes during an inspection. Have a new auditor observe a few audits before they are allowed to participate. Make sure they take notes, and explain what you are doing and why while they observe you conducting audits. Review the new auditor’s notes periodically throughout the audit to provide suggestions for improvement and identify missing information. You can also record a supplier audit or internal audit and let a new trainee take notes while watching the recording. This eliminates the need to coordinate schedules to involve the trainee.

Quality auditor training should include practicing audit agenda creation

Have new lead auditors submit a draft audit agenda to you before sending it to the supplier or department manager. The first audit agenda will usually need revision, and possibly multiple revisions. Make sure you train the person to include enough detail in the agenda; using a checklist or template is recommended. Agenda creation is part of audit preparation, and it can be done without time pressure.

How do you audit the auditing process?

Most quality managers are experienced and have little trouble planning an audit schedule. The next step is to conduct the audit. The problem is that there is very little objective oversight of the auditing process. The ISO 13485 standard for medical devices requires that “Auditors shall not audit their own work.” Therefore, most companies will opt for one of two solutions for auditing the internal audit process: 1) hire a consultant or 2) ask the Director of Regulatory Affairs to audit the internal auditing process.

Both of the above strategies for auditing the internal audit process meet the requirements of ISO 13485, but neither approach helps to improve an internal auditor’s performance. I have interviewed hundreds of audit program managers over the years, and the most common feedback they give is “Change the wording of this finding” or “You forgot to close this previous finding.” This type of feedback relates to the report-writing phase of the audit process. I rarely hear program managers explain how they help auditors improve at the other parts of the process.

When auditors are first being trained, we typically provide examples of best practices for audit preparation, checklists, interviewing techniques, AND reports. After the audit program manager “shadows” the auditors an arbitrary three times, the auditors are miraculously “trained.” Let’s see if I can draw an analogy to make my point.

That is kind of like watching your 16-year-old drive the family car three times and then giving them a license.

About the Author

Robert Packard is a regulatory consultant with 25+ years of experience in the medical device, pharmaceutical, and biotechnology industries. He is a graduate of UConn in Chemical Engineering. Robert was a senior manager at several medical device companies, including serving as President/CEO of a laparoscopic imaging company. His Quality Management System expertise covers all aspects of developing, training, implementing, and maintaining ISO 13485 and ISO 14971 certifications. From 2009 to 2012, he was a lead auditor and instructor for one of the largest Notified Bodies. Robert’s specialty is regulatory submissions for high-risk medical devices, such as implants and drug/device combination products, for CE marking applications, Canadian medical device applications, and 510(k) submissions. His favorite part of the job is training others. He can be reached via phone at 802.258.1881 or by email. You can also follow him on LinkedIn, Twitter, and YouTube.

