4.2.8 Guidelines for SB/SE National Quality Review

4.2.8.1 Program Scope and Objectives
4.2.8.1.1 Background
4.2.8.1.2 Authority
4.2.8.1.3 Roles and Responsibilities
4.2.8.1.3.1 Program Manager Responsibilities
4.2.8.1.3.2 Quality Analysts Responsibilities
4.2.8.1.3.3 Front-Line Manager Responsibilities
4.2.8.1.3.4 Reviewer Responsibilities
4.2.8.1.4 Program Reports and Effectiveness
4.2.8.1.5 Program Controls
4.2.8.1.6 Terms and Acronyms
4.2.8.1.7 Related Resources
4.2.8.2 Overview of National Quality Review Process
4.2.8.2.1 Quality Attributes
4.2.8.2.2 Evaluating and Coding the Attributes
4.2.8.2.3 Attribute Scoring System
4.2.8.3 Case Review Procedures
4.2.8.3.1 Review of Electronic Case File
4.2.8.3.2 DCI Header Input Procedures
4.2.8.3.3 Reason Code Selection and Writing Guidelines for Attribute Narratives
4.2.8.4 Field Exam Case Sampling Criteria
4.2.8.5 Specialty Exam Case Sampling Criteria
4.2.8.6 Overview of National Quality Review Case Selection Procedures
4.2.8.6.1 Field and Office Exam Case Selection Procedures
4.2.8.6.2 Employment, Estate and Gift/Excise Case Selection Procedures
4.2.8.6.3 BSA Case Selection Procedures
4.2.8.6.4 Unagreed Appeals Case Selection Procedures
4.2.8.6.5 Defaulted Case Selection Procedures
4.2.8.6.6 Shipping Physical Case Files Selected for Quality Review
4.2.8.6.7 Sample Select Case Control Procedures
4.2.8.7 Case Review Consistency
4.2.8.8 Use and Limitations of National Quality Review Data
4.2.8.9 Field Exam Case Return Criteria
4.2.8.9.1 Specialty Exam Case Return Criteria
Exhibit 4.2.8-1 National Standard Time Frames for Case Action

Part 4. Examining Process
Chapter 2. General Examining Procedures
Section 8. Guidelines for SB/SE National Quality Review

4.2.8 Guidelines for SB/SE National Quality Review

Manual Transmittal
July 12, 2022

Purpose
(1) This transmits a revision to IRM 4.2.8, Examining Process, General Examining Procedures, Guidelines for SB/SE National Quality Review.

Material Changes
(1) Editorial changes were made throughout this IRM to add clarity, readability, and to eliminate redundancies. Website addresses, legal references, and IRM references were reviewed and updated as necessary.
(2) Significant changes to this IRM are reflected in the table below:

IRM | Description of Change
4.2.8.1 Program Scope and Objectives | Updated IRM citations at 4.2.8.1(4) and 4.2.8.1(5). Added new IRM citation at 4.2.8.1(8).
4.2.8.1.1 Background | Added new content at 4.2.8.1.1(5), moved former 4.2.8.1.1(5) to 4.2.8.1.1(6) and updated content and hyperlink, and moved former 4.2.8.1.1(6) to 4.2.8.1.1(7) and updated IRM citation.
4.2.8.1.2 Authority | Deleted old content at 4.2.8.1.2(1) and replaced it with a new, updated citation at 4.2.8.1.2(2); deleted content at 4.2.8.1.2(3) and 4.2.8.1.2(4) and replaced it with new TBOR language at 4.2.8.1.2(3) as requested by Chief Counsel.
4.2.8.1.3.2 Quality Analyst Responsibilities | Deleted content at 4.2.8.1.3.2(2) as these duties are now the responsibility of Specialty Exam.
4.2.8.1.5 Program Controls | Updated content and hyperlink at 4.2.8.1.5(2) and added clarifying language at 4.2.8.1.5(3).
4.2.8.1.6 Terms and Acronyms | Modified two terms, SB/SE Field Exam and SB/SE Specialty Exam, and their definitions for clarity.
4.2.8.1.7 Related Resources | Added new URL at 4.2.8.1.7(4).
4.2.8.2.1 Quality Attributes | Modified language in the last bullet point at 4.2.8.2.1(1) and added content at 4.2.8.2.1(3) for clarity.
4.2.8.3.1 Review of Electronic Case File | Updated content at 4.2.8.3.1(1) and added a note at 4.2.8.3.1(3) for clarity.
4.2.8.5 Specialty Exam Case Sampling Criteria | Added content at 4.2.8.5(2) to exclude Form 1041 cases.
4.2.8.6 National Quality Review Case Selection Procedures | Added the word "Overview" to the title for clarity, deleted outdated content found in 4.2.8.6(3), and moved content at 4.2.8.6(4) to new 4.2.8.6.3.
4.2.8.6.1 Unagreed Appeals Case Selection Procedures | Moved content to 4.2.8.6.4 and added new procedures relating to electronic case files.
4.2.8.6.2 Defaulted Case Selection Procedures | Moved content to 4.2.8.6.5 and added new procedures relating to defaulted electronic case files.
4.2.8.6.3 Shipping Sample Select Cases | Moved content to 4.2.8.6.6 and updated content for clarity.
4.2.8.6.4 Sample Select Control Procedures | Moved content to 4.2.8.6.7.
NEW 4.2.8.6.1 Field and Office Exam Case Selection Procedures | Added new procedures for physical and electronic cases selected for review.
NEW 4.2.8.6.2 Employment, Estate and Gift and Excise Case Selection Procedures | Added new procedures for physical and electronic cases selected for review.
4.2.8.9 Case Return Criteria | Updated title to Field Exam Case Return Criteria to reflect that the guidance applies only to Field and Office Exam case returns.
NEW 4.2.8.9.1 Specialty Exam Case Return Criteria | Added content for case return procedures for Employment, Estate and Gift/Excise cases that meet the return criteria.

Effect on Other Documents
This material supersedes IRM 4.2.8, dated October 6, 2020.

Audience
Small Business/Self-Employed (SB/SE) Field and Specialty Exam Employees.

Effective Date
(07-12-2022)

Garrett Gluth
Director, Exam Quality and Technical Support
Small Business/Self-Employed

4.2.8.1 (07-12-2022) Program Scope and Objectives
General Overview. Field and Specialty Exam Quality (FSEQ) supports the Small Business/Self Employed (SB/SE) quality improvement program, providing an assessment of the quality of Field and Specialty Examination case work.
Purpose. This IRM section contains general information and procedural guidance relating to the SB/SE Field and Specialty Exam National Quality Review program.
Audience. The audience is employees and management officials in FSEQ as well as SB/SE stakeholders.
Policy Owner. The Director, Exam Quality and Technical Support (EQ&TS), is responsible for the policies related to the National Quality Review program. Refer to IRM 1.1.16.5.5.4, Exam Quality and Technical Support, for more information.
Program Owner. The Program Manager, FSEQ, is responsible for overseeing the National Quality Review program. Refer to IRM 1.1.16.5.5.4.5, Field and Specialty Exam Quality, for more information.
Program Goals. The goal of the National Quality Review program is to provide a practical and accurate method of assessing organizational performance in support of the balanced measures.
Primary Stakeholder. The Director, Examination, SB/SE. Additional stakeholders are Directors located in:
- Headquarters Examination
- Field Examination
- Field and Campus Policy
- Specialty Policy
- Specialty Tax
Contact Information. To recommend changes or make any other suggestions related to this IRM section, see IRM 1.11.6.5, Providing Feedback About an IRM Section - Outside of Clearance.

4.2.8.1.1 (07-12-2022) Background
Embedded Quality (EQ) creates a link between individual performance and organizational goals. This linkage is achieved through a common set of attributes that both national quality reviewers in FSEQ and front-line managers use to evaluate the quality of case work. EQ reviews focus on whether the examiner took the right actions at the right time while protecting taxpayer rights.
National quality reviewers in FSEQ use the National Quality Review System (NQRS), an automated web-based system, to record results from case reviews for the following programs:
- Field and Office Examination
- Bank Secrecy Act (BSA)
- Employment Tax
- Estate and Gift Tax
- Excise Tax
Reports generated from NQRS provide data which may be used to evaluate organizational processes, procedures and successes, and identify areas in need of improvement.
The Quality Knowledge Base contains Examination Quality program information, including Embedded Quality Review System (EQRS) and National Quality Review System (NQRS) system guidance. The Quality Knowledge Base is located at https://portal.ds.irsnet.gov/sites/vl115/pages/default.aspx.
Managers use the Embedded Quality Review System (EQRS) database to evaluate employee performance. For more information regarding front-line manager use of EQRS, see IRM 1.4.40.3.6, Performance Feedback.
Note: NQRS data is never used to evaluate employee performance.

4.2.8.1.2 (07-12-2022) Authority
IRM 1.2.1.2.2, Policy Statement 1-2, Principles of Quality, provides the authoritative basis for the procedures in this IRM.
26 CFR 801.6(b) states that quality measures focus on whether IRS personnel:
- Devoted an appropriate amount of time to a matter
- Properly analyzed the facts of the situation
- Complied with statutory, regulatory and IRS procedures
- Took timely actions
- Provided adequate notification and made required contacts with taxpayers
The Taxpayer Bill of Rights (TBOR) lists rights that already existed in the tax code, putting them in simple language and grouping them into 10 fundamental rights. Employees are responsible for being familiar with and acting in accord with taxpayer rights. See IRC 7803(a)(3), Execution of Duties in Accord with Taxpayer Rights. For additional information about the TBOR, see https://www.irs.gov/taxpayer-bill-of-rights.

4.2.8.1.3 (10-06-2020) Roles and Responsibilities
Listed below are the primary roles and responsibilities of the FSEQ program manager, quality analysts, management and quality reviewers involved in the quality review process.
4.2.8.1.3.1 (10-06-2020) Program Manager Responsibilities
The FSEQ program manager's primary responsibilities include:
- Overseeing and allocating resources for FSEQ
- Coordinating the development of the annual case review sample plan for FSEQ
- Ensuring that case review inventory is sufficient for each Field and Specialty Exam Area or program based on the sample plan
- Monitoring the delivery of the Field and Specialty Exam national sampling plan
- Coordinating issues relating to interpreting and rating the quality attributes
- Establishing protocol to measure, monitor, and improve reviewer accuracy and consistency
- Sharing analysis of NQRS data to aid in organizational improvement and influence quality performance
- Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis
- Coordinating with stakeholders in the development of attributes and requirements for quality reviews
- Providing recommendations to enhance NQRS

4.2.8.1.3.2 (07-12-2022) Quality Analysts Responsibilities
FSEQ analysts are responsible for:
- Developing and distributing quality performance reports
- Developing the annual case review sample plan
- Reviewing attribute narratives on a regular basis to ensure guidelines are followed
- Participating in group meetings to promote consistency, including the discussion of specific attributes and case scenarios
- Developing and clarifying review criteria and procedures to promote consistency
- Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis
- Coordinating with stakeholders in their quality improvement initiatives
- Collaborating with stakeholders in the development of attributes and requirements for quality reviews
- Working with stakeholders in monitoring and updating job aids, instructional guides and quality review procedures in accordance with IRM and program guidelines

4.2.8.1.3.3 (07-12-2022) Front-Line Manager Responsibilities
FSEQ front-line manager responsibilities include:
- Providing guidance for program objectives
- Ensuring reviewers understand and adhere to program guidelines
- Ensuring accurate and consistent application of the quality attributes
- Reviewing attribute narratives on a regular basis to ensure guidelines are followed
- Critiquing completed reviews on a regular basis and providing meaningful feedback to reinforce expectations for quality case reviews
- Conducting group meetings to promote consistency, including the discussion of specific attributes and case scenarios
- Ensuring accuracy of data input
- Ensuring the sample plan is followed
- Monitoring the sample plan and recommending actions to address imbalances
- Maintaining instructional guides for national quality reviewers
- Reviewing and approving case returns that meet the criteria found in IRM 4.2.8.9, Field Exam Case Return Criteria
- Reviewing and approving rejection of cases that do not meet the case sampling criteria found in IRM 4.2.8.4 and IRM 4.2.8.5
- Sharing trends and issues that may have nationwide impact
- Providing input during the attribute development or update process

4.2.8.1.3.4 (10-06-2020) Reviewer Responsibilities
FSEQ reviewer responsibilities include:
- Evaluating examination case quality by conducting reviews of completed SB/SE Field and Specialty Exam cases
- Accurately and consistently applying the attributes utilizing the appropriate Job Aid and tools such as the IRM and Internal Revenue Code
- Completing timely case reviews using the Data Collection Instrument (DCI)
- Completing timely and accurate input of review data into the NQRS database
- Identifying the appropriate reason code(s) for each not met attribute rating
- Writing clear and meaningful attribute narrative comments for each not met attribute rating
- Elevating potential conflicts in the IRM and the Job Aid for resolution
- Assisting in data analysis as warranted

4.2.8.1.4 (10-06-2020) Program Reports and Effectiveness
Program reports are available on NQRS by selecting Reports from the main menu screen. FSEQ also generates quarterly performance reports for stakeholders. These reports provide data to aid in:
- Establishing baselines to assess program performance
- Identifying quality strengths and weaknesses
- Determining specific training/educational needs
- Identifying opportunities to improve work processes
- Measuring the success of quality improvement efforts
An overall quality score serves as the Balanced Measure for Business Results - Quality. This measure is reported to various levels of the organization and to external stakeholders such as Congress.

4.2.8.1.5 (07-12-2022) Program Controls
Access to EQRS and NQRS data and reports is controlled based on the user's assigned permission level, assigned function and assigned organization. System coordinators are responsible for assigning users to the appropriate permission level based on the user's role in the organization. Users are given only the privileges required to perform their jobs and do not have access to security and other functions/features that require elevated privileges. Access to EQRS and NQRS is requested through the Business Entitlement Access Request System (BEARS) at https://bears.iam.int.for.irs.gov/home/Index.
EQRS/NQRS systems contain input validation checks to ensure input accuracy and completeness by:
- Restricting data input to established system parameters to ensure data accuracy
- Using drop-down lists as much as possible to restrict users from typing invalid information
- Displaying an error message if invalid data is input into the system
- Requiring data field input before proceeding
Operations Support, Technology Solutions, Collection Systems provides core information technology management and support services for both EQRS and NQRS.
They are responsible for:
- Ensuring compliance with the Federal Information Security Management Act (FISMA)
- Managing Unified Work Requests (UWR) for system updates and changes
- Leading the development of enhanced data and computer security processes and controls

4.2.8.1.6 (07-12-2022) Terms and Acronyms
The following table contains commonly used terms and acronyms found in this IRM:

Term/Acronym | Definition
BSA | Bank Secrecy Act
CCP | Centralized Case Processing
CJE | Critical Job Element
CEAS | Correspondence Examination Automation Support
DCI | Data Collection Instrument
EQ | Embedded Quality
EQ&TS | Exam Quality & Technical Support
FSEQ | Field and Specialty Exam Quality
FEQ | Field Exam Quality; responsible for cases selected for quality review from revenue agents, tax compliance officers and tax auditors located in Field Examination
EQRS | Embedded Quality Review System
ERCS | Examination Returns Control System
IMS | Issue Management System
IRM | Internal Revenue Manual
ITAMS | Information Technology Asset Management System
NQRS | National Quality Review System
SB/SE | Small Business/Self Employed
SPRG | Specialized Product Review Group
SEQ | Specialty Exam Quality; responsible for cases selected for quality review from revenue agents, attorneys, revenue officer examiners and fuel compliance agents located in Specialty Examination
UWR | Unified Work Request

4.2.8.1.7 (07-12-2022) Related Resources
Field and Specialty Exam job aids are reference tools used by Field and Specialty Exam management and FSEQ review staff to aid in rating the quality attributes in a uniform and consistent manner. Guidelines in the job aids align the EQ concepts to current Field and Specialty Exam procedures. IRM references support each quality attribute.
Headquarters Examination, Examination Field and Campus Policy is responsible for ensuring the consistency of the Field and Office Exam job aids and training materials, along with the IRM and other guidelines.
Headquarters Examination, Specialty Policy is responsible for ensuring the consistency of the Specialty Exam job aids and training materials, along with the IRM and other guidelines.
Links to the job aids may be found on the Quality Knowledge Base, which is located at https://portal.ds.irsnet.gov/sites/vl115/pages/default.aspx.

4.2.8.2 (10-06-2020) Overview of National Quality Review Process
The quality review process provides data to measure, monitor and improve the quality of work. Organizational performance is measured by conducting independent case reviews from a statistically valid sample of examination case work. Specific measurement criteria, referred to as quality attributes, are used to evaluate the quality of case work.
4.2.8.2.1 (07-12-2022) Quality Attributes
Quality attributes address whether:
- Timely service was provided to the taxpayer
- Facts of the case were properly analyzed
- Law was correctly applied
- Taxpayer rights were protected by following applicable IRS policies and procedures, including timeliness, adequacy of notifications, and required contacts with taxpayers
- Appropriate determination was reached regarding liability for tax
Quality attributes are organized into measurement categories which allow quality data to be generated based on the following criteria:
- Timeliness - resolving issues in the most efficient manner through proper time utilization and workload management techniques
- Professionalism - promoting a positive image of the Service by using effective communication techniques
- Regulatory Accuracy - adhering to statutory/regulatory process requirements
- Procedural Accuracy - adhering to internal process requirements
Quality attributes can also be organized by the following DCI attribute groups:
- Planning
- Income Determination (Field Exam)
- Investigative/Audit Techniques
- Timeliness
- Customer Relations/Professionalism
- Documentation/Reports

4.2.8.2.2 (10-06-2020) Evaluating and Coding the Attributes
Reviewers evaluate case work utilizing attributes specific to their Specialized Product Review Group (SPRG). Reviewers rate all attributes that apply to the case being reviewed. Attribute ratings must be accurate and consistent. Reviewers must strive for consistency in rating similar case actions.

4.2.8.2.3 (10-06-2020) Attribute Scoring System
The scoring system provides for the equal weighting of each attribute. Each attribute is rated as Yes, No, or in some instances Not Applicable. The quality score is computed as a percentage, calculated as total Yes ratings divided by total Yes and No ratings. A total score of 100 percent is possible for each case. A worked example of this computation appears after IRM 4.2.8.3.1 below.

4.2.8.3 (10-06-2020) Case Review Procedures
The DCI is the principal documentation for the reviewer's case evaluation and conclusions. A DCI is completed for each case reviewed in NQRS. Reviewers must ensure that all entries on the DCI are accurate and records are not duplicated. Reviewers will review one case at a time to completion before starting another case review.
Steps in the review process include:
- Review the paper case file and the electronic file, where applicable
- Input case review data on the DCI and prepare narratives to explain each not met attribute rating
- Review the DCI for accuracy and narrative quality
- Edit the DCI as necessary
- Complete the case review

4.2.8.3.1 (07-12-2022) Review of Electronic Case File
Either a physical or electronic case may be assigned for review. If a physical case file is assigned for review, the physical case file is the primary source. Documents found in the electronic case file might not be in the physical case file because they were not printed or were inadvertently removed. If there are indications in the physical case file that electronic documents exist, reviewers should access the electronic file to determine if additional information is available.
Electronic case files for the Excise, Employment and Estate and Gift programs are located on the Issue Management System (IMS) Team Site.
Note: An approved BEARS request is required for access to the IMS Team Site.
Electronic case files for Field Exam are located in Correspondence Examination Automation Support (CEAS).
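The following is a minimal sketch, in Python, of the attribute scoring computation described in IRM 4.2.8.2.3 above. The function name and data layout are illustrative assumptions only and do not reflect the actual NQRS implementation.

```python
# Illustrative only: a minimal sketch of the scoring rule in IRM 4.2.8.2.3
# (equal weighting; score = Yes ratings divided by Yes plus No ratings,
# expressed as a percentage; Not Applicable ratings drop out of the math).
# The function name and data layout are hypothetical, not the NQRS design.

def quality_score(ratings):
    """Return the case quality score as a percentage, or None if no
    attribute on the case was rated Yes or No."""
    yes = sum(1 for r in ratings if r == "Yes")
    no = sum(1 for r in ratings if r == "No")
    rated = yes + no                      # "Not Applicable" is excluded
    if rated == 0:
        return None
    return round(100 * yes / rated, 1)

# Example: 9 attributes met, 1 not met, 2 not applicable -> 90.0 percent.
print(quality_score(["Yes"] * 9 + ["No"] + ["Not Applicable"] * 2))
```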
4.2.8.3.2 (10-06-2020) DCI Header Input Procedures
The first input section of the DCI is the header fields that capture basic case information. The bold header fields are mandatory and must be entered to complete the DCI. Header information is categorized into four groupings:
- Review Information - specific information about the review itself
- Case Information - specific information about the case
- Process Measures - case actions taken by the examiner that are used to measure the efficiency of the examination process
- Special Use - special tracking for local or national purposes
Process Measures data may be analyzed in conjunction with the quality attributes. Process Measures data fields capture:
- Specific tasks performed during the examination
- How these tasks were completed
- Key dates
- Delays in activities
- Hours associated with the case

4.2.8.3.3 (10-06-2020) Reason Code Selection and Writing Guidelines for Attribute Narratives
When a quality attribute is rated not met, at least one reason code, if available, must be selected that supports the not met rating. The most appropriate reason code should be selected for the error. Multiple reason codes may be selected for multiple errors, if warranted. A narrative is required, describing the facts, for each not met attribute rating. Reviewers should contact their manager when "Other" is used regularly as a reason code to determine if additional reason codes should be added to NQRS.
Reviewer narratives must be thorough, providing clear, concise, and specific descriptions of any errors, offering sufficient detail to allow for specific recommendations for improvement. Reviewers must avoid using canned statements in their narratives. Attribute narratives should:
- Clearly state the facts that resulted in the attribute rating
- Identify the nature of the error in the first sentence of the narrative
- Indicate what was not done, not what should have been done
- Evaluate the case, not the examiner
Reviewers should not include taxpayer-specific or Personally Identifiable Information (PII) data in the narrative comments.

4.2.8.4 (10-06-2020) Field Exam Case Sampling Criteria
The following Field Exam cases are included in the review sample:
- SB/SE revenue agent and tax compliance officer income tax cases (corporations, partnerships, and individual returns)
- Agreed, partially agreed, unagreed, no-change, and cases protested to Appeals
- Secured delinquent returns not accepted as filed
- Training cases
- Form 1041, U.S. Income Tax Return for Estates & Trusts; Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons; and Form 1120-F, U.S. Income Tax Return of a Foreign Corporation, tax returns examined by revenue agents
- Correspondence cases examined by revenue agents, tax auditors, and tax compliance officers
- Pre-assessment innocent spouse cases
- Claims
- Audit reconsideration cases
- Employment tax cases if they are closed as related cases to an income tax case (the entire related case package is included)
The following Field Exam cases are excluded from the national quality review sample:
- Secured delinquent returns accepted as filed
- Penalty cases not included as part of an examination case
- Surveyed returns
- Offer in Compromise (OIC) cases
- Post-assessment innocent spouse cases
- Surveyed claim cases (Disposal Code 34)
- No show/no response cases
- Protested cases with 395 days or less remaining on the statute
Note: If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute of limitations.
Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.
- Petitioned cases
- Cases updated to suspense status
- Cases updated to group status after the 90-day letter is issued
- Cases closed via Form 906, Closing Agreement
- Specific project codes as determined by Headquarters Examination

4.2.8.5 (10-06-2020) Specialty Exam Case Sampling Criteria
The following Specialty Exam cases are included in the review sample:
- Excise tax
- Estate and Gift tax
- Employment tax where there is no related income tax case
- BSA Title 31 and Form 8300
The following Specialty Exam cases are excluded from the national quality review sample:
- Secured delinquent returns accepted as filed
- Penalty cases not included as part of an examination case
- Surveyed returns
- Offers in Compromise (OIC) cases
- Post-assessment innocent spouse cases
- Surveyed claim cases (Disposal Code 34)
- No show/no response cases
- Protested cases with 395 days or less remaining on the statute of limitations
Note: If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute of limitations. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.
- Petitioned cases
- Cases updated to suspense status
- Cases updated to group status after the 90-day letter is issued
- Cases closed via Form 906, Closing Agreement
- Specific project codes as determined by Headquarters Examination
- Activity Code 421 returns - Gift
- Form 706GS(D), Generation-Skipping Transfer Tax Return for Distributions, and Form 706GS(T), Generation-Skipping Transfer Tax Return For Terminations
- Cases worked by Estate and Gift support staff, paraprofessional (Position Code 316) and audit accounting aide (Position Code 301)
- Estate and Gift returns assigned outside of the Estate and Gift area
- Excise Form 2290, Heavy Highway Vehicle Use Tax Return
- Excise returns assigned outside of the Excise area
- Employment Tax Form 1040 tip cases
- Form 1041, U.S. Income Tax Return for Estates and Trusts

4.2.8.6 (07-12-2022) Overview of National Quality Review Case Selection Procedures
The Examination Returns Control System (ERCS) Sample Review program automates the process of randomly selecting a valid sample of cases meeting the sampling criteria for review. The sample size is statistically valid at the Field Exam Area level and the Specialty Exam Program level. The annual sample plan is based on projected fiscal year closures for each SB/SE program.
Cases meeting the sample criteria are selected by the ERCS Sample Review program at the designated sample rate for the Field Exam Area and for three of the Specialty Exam Programs (Excise, Employment, and Estate and Gift). Cases are subject to the sample at the point they move to Status Code 51 (In transit to Centralized Case Processing) or 21 (In transit to Technical Services).
BSA cases are not controlled on ERCS. See IRM 4.2.8.6.3, BSA Case Selection Procedures, for more information.
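To illustrate the selection concept described in IRM 4.2.8.6 above, the following is a minimal sketch assuming a simple per-case random draw against a designated sample rate at the point a case moves to Status Code 51 or 21. The function, parameter names, and rate value are hypothetical and do not represent the actual ERCS Sample Review program logic.

```python
# Hypothetical illustration of random case selection at a designated sample
# rate, keyed to the status codes named in IRM 4.2.8.6 (51 - in transit to
# CCP, 21 - in transit to Technical Services). This is not the ERCS design.
import random

SAMPLE_STATUS_CODES = {51, 21}

def select_for_review(status_code, sample_rate, rng=random.random):
    """Return True if a case moving to a sampled status code is drawn
    into the national quality review sample."""
    if status_code not in SAMPLE_STATUS_CODES:
        return False
    return rng() < sample_rate

# Example: a case moving to Status Code 51 under an assumed 5 percent rate.
print(select_for_review(51, sample_rate=0.05))
```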
4.2.8.6.1 (07-12-2022) Field and Office Exam Case Selection Procedures
For physical case files selected for review, after the case has been processed by CCP and updated to Status Code 90 (Closed), CCP will ship the physical file to Field Exam Quality (FEQ) support staff. Upon receipt from CCP, the FEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.
For electronic case files selected for review, after the case has been processed by CCP and updated to Status Code 90, the FEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

4.2.8.6.2 (07-12-2022) Employment, Estate and Gift/Excise Case Selection Procedures
For physical case files selected for review, after the case has been processed by CCP and updated to Status Code 90, CCP will ship the physical file to Specialty Exam Quality (SEQ) support staff. Upon receipt from CCP, the SEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.
For electronic case files selected for review, after the case has been processed by CCP and updated to Status Code 90, the SEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

4.2.8.6.3 (07-12-2022) BSA Case Selection Procedures
Form 8300 cases are selected from the weekly extract of closed cases maintained by the Enterprise Computing Center - Detroit (ECC-DET) and shipped to SEQ.
Title 31 cases are selected from the closed case Title 31 database using the NQ interface and shipped to SEQ.

4.2.8.6.4 (07-12-2022) Unagreed Appeals Case Selection Procedures
The ERCS Sample Review program may select unagreed cases for review. Technical Services is responsible for sending physical unagreed Appeals cases, and physical unagreed Appeals cases with at least one agreed/no-change year, that are selected for sample review to the appropriate review site. These cases are high priority, and procedures are established to ensure their timely review. Refer to IRM 4.8.2.3.4, Technical Services, Case Processing, for more information. For unagreed electronic Appeals cases and unagreed electronic Appeals cases with at least one agreed/no-change year, the FEQ support staff will add the electronic case to review inventory.
When "open" cases are transmitted to the review site by Technical Services, they should be updated to Status Code 23, Sample Review, and Review Type 33 on ERCS. Reviewers will complete their review of the open unagreed case within 10 business days and return the physical case files to Technical Services via ground mail service. When the review of an electronic case file has been completed, the group manager will e-mail the Technical Services manager to advise that the review is complete.
Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. If a non-docketed case is selected for sample review, there must be an additional 30 days on the statute of limitations to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals. Cases that do not meet these criteria will be deselected and returned to Technical Services.
Reviewers prepare an Appeals Advisory Memo to Technical Services when tax application/computation errors are found or if any taxpayer confidentiality issues are discovered. Technical Services decides whether to forward the case to Appeals or return it to the group.
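As a worked illustration of the 365-day Appeals requirement plus the 30-day review allowance described in IRM 4.2.8.6.4 above (which is why IRM 4.2.8.4 and IRM 4.2.8.5 exclude protested cases with 395 days or less remaining on the statute), the following is a minimal sketch. The function and the example dates are hypothetical and are not part of any IRS system.

```python
# Hypothetical illustration of the statute-of-limitations screen for
# protested, non-docketed cases: at least 365 days must remain when the
# case is received in Appeals, and sample review requires roughly 30 more
# days, so cases with 395 days or less remaining are not reviewed.
from datetime import date

APPEALS_MINIMUM_DAYS = 365
REVIEW_BUFFER_DAYS = 30

def meets_statute_criteria(statute_expiration, as_of):
    """Return True if more than 395 calendar days remain on the statute."""
    days_remaining = (statute_expiration - as_of).days
    return days_remaining > APPEALS_MINIMUM_DAYS + REVIEW_BUFFER_DAYS

# Example: 396 days remaining passes the 395-day screen.
print(meets_statute_criteria(date(2024, 2, 1), as_of=date(2023, 1, 1)))
```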
4.2.8.6.5 (07-12-2022) Defaulted Case Selection Procedures
The ERCS Sample Review program may also select unagreed cases closing for issuance of a statutory notice of deficiency as part of the random sample of cases for review. Technical Services will affix the sample selection sheet to these case files and update all returns with an account transfer out freeze code "M". If the case defaults, Technical Services will send the case to CCP. The freeze code "M", along with the sample selection sheet, will alert CCP that the case must be sent to the appropriate review site.
CCP will update the case to Status Code 90 (Closed), remove the freeze code "M" and forward the physical case(s) to FEQ support staff. For electronic case files, when CCP updates the case to Status Code 90 (Closed) and removes the freeze code "M", the FEQ support staff will add the case to unassigned inventory.

4.2.8.6.6 (07-12-2022) Shipping Physical Case Files Selected for Quality Review
Physical cases selected for review should be transmitted to their respective review site. Field Exam physical cases are shipped to FEQ support staff. Specialty Exam physical cases are shipped to SEQ support staff. After screening, physical case files are shipped directly to reviewers.
Closed case files should remain intact after they leave the Employment, Estate and Gift and Excise groups and Technical Services. Dismantling, purging, or discarding documents from a case file could negatively affect the case if legal actions are pursued.
A separate Form 3210, Document Transmittal, shall be attached to the closed case files. Each selected case shall include the full physical case file.

4.2.8.6.7 (10-06-2020) Sample Select Case Control Procedures
Each review site will maintain an inventory control system. This will facilitate an orderly flow of case files and supporting documents between closing units, the review site, and the reviewer. All closed (Status Code 90) physical case files, along with Form 3210, are transported via ground shipment for final disposition.

4.2.8.7 (10-06-2020) Case Review Consistency
The reviewer's case evaluation must be accurate and consistent to provide reliable and meaningful results. The FSEQ front-line manager should periodically perform consistency checks to ensure consistent and accurate application of the quality attributes and accurate data input.
The FSEQ front-line manager may conduct consistency reviews in several ways, including:
- Have each reviewer independently review the same case and discuss any inconsistencies in attribute rating
- Critique completed case reviews and provide feedback to reinforce expectations of the review outcomes
- Utilize NQRS reports, including reviewer narratives, to evaluate consistency, ensure guidelines are followed and ensure the narratives are clearly and professionally written
- Hold group meetings to discuss specific attributes and case scenarios
Results of the consistency reviews are maintained and updated as warranted.

4.2.8.8 (07-12-2022) Use and Limitations of National Quality Review Data
The fundamental purpose of the National Quality Review program is to provide an overall organizational assessment of case quality. Quality review results are statistically valid and reliable measurements of the overall quality of casework completed by SB/SE Field Exam only at the Area level. Specialty Exam national quality results are statistically valid at the program level.
Results stratified to any lower organizational segment are not statistically reliable measurements of the quality of casework at those levels. Lower organizational segment stratifications are indicators and should be relied upon only to the extent that they are confirmed by other reliable management measures of quality.
The design and format of quality review reports within NQRS, as well as access to the reports and data, will be determined by the Field and Specialty Exam program manager. No attempt should be made to associate specific review results with a particular case. Review data is used to assess program performance and will not be used to evaluate individual employee performance. Any feedback or other data generated from NQRS will not be used as a substitute for EQRS case reviews, on-the-job visits, workload reviews, or any other reviews.

4.2.8.9 (07-12-2022) Field Exam Case Return Criteria
FSEQ reviewers will follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues.

4.2.8.9.1 (07-12-2022) Specialty Exam Case Return Criteria
SEQ reviewers will also follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues. Additional guidance is provided for SEQ for cases that do not close to Technical Services.
The SEQ reviewer will take the following actions for Employment, Estate and Gift and Excise cases meeting the case return criteria found in IRM 4.8.2.9.1, Case Return Criteria:
- Prepare a case return memo. The memo will address the facts, law, and recommended actions needed for the case.
- Consult with a Specialty Exam Policy Subject Matter Expert (SME) to resolve any technical issues relating to the case, if warranted.
- Forward the case return memo to the SEQ front-line manager for concurrence.
- The SEQ front-line manager will send the case return memo to the quality analyst for concurrence.
- The quality analyst will send the case return memo to the Specialty field front-line manager with a copy to the territory manager, the program policy analyst responsible for quality, and the field technical advisor.
- The Specialty field front-line manager will provide the SEQ front-line manager with shipping instructions.
- The SEQ reviewer will ship the case per instructions from the Specialty field group front-line manager, following current shipping guidelines.
Note: Document all case return actions in the case file.
The returned case will remain in Status Code 90, Closed. The decision to act on the SEQ reviewer's recommendations is the responsibility of the Specialty field front-line and territory managers. The Specialty field front-line manager is responsible for shipping the case to files once any needed actions are completed. Document all follow-up actions in the case file.
Exhibit 4.2.8-1
National Standard Time Frames for Case Action

Column definitions:
Activity - Type of exam action or activity measured
Days - Maximum number of calendar days permitted for the exam action or activity
Measured From - Start of the exam action or activity
Measured To - End of the exam action or activity

The national recommended standard time frames (unless noted, measured in calendar days) are shown in the table below:

Activity | Program | Days | Measured From | Measured To
Start Examination | Field Exam, Excise, Employment, BSA | 45 | First action | First appointment
Contact | Estate and Gift | 45 | Examiner's receipt of case | Date examiner sends an initial contact letter to the taxpayer with a copy to the representative, or surveys the assigned case
Significant Activity | Field and Specialty Exam | 45 | Last significant action | Next significant activity
Response to call | Field and Specialty Exam | 1 business day | Taxpayer or representative telephone call | Return telephone call to the taxpayer or representative
Response to correspondence | Field and Specialty Exam | 14 | Receipt of correspondence or documentation from taxpayer or representative | Provide follow-up response to taxpayer or representative
POA Processing | Specialty Exam | As soon as possible or within 24 hours of the receipt date | Receipt of Form 2848 | Submission to CAF Unit for processing
Agreed/No Change Case Closing | Field Exam | 10 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is closed from the group
Agreed/No Change Case Closing | Estate and Gift, Excise | 30 | Date the report is received or the date the no-change status is communicated to the taxpayer/financial institution | Date the case is closed from the group
Agreed/No Change Case Closing | Employment | 20 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is updated to Status 51 and closed from the group
Agreed/No Change Case Closing | BSA | 20 | Date the closing letter is finalized | Date the case is closed from the group
Agreed cases with unpaid proposed assessments of $100,000 and greater | Field and Specialty Exam | 4 | Date the report is received | Date the case is closed from the group
Unagreed Case Closing | Field and Specialty Exam | 20 | Date the 30-Day Letter defaults or the date the request for an Appeals conference is received | Date the case is closed from the group
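The following is a minimal sketch of how an action could be measured against one of the standards in Exhibit 4.2.8-1, assuming calendar-day measurement from the "Measured From" date to the "Measured To" date as defined in the column definitions above. The function and example dates are hypothetical and are not part of any IRS system.

```python
# Hypothetical illustration of checking a case action against a national
# standard time frame from Exhibit 4.2.8-1: elapsed calendar days between
# the "Measured From" and "Measured To" dates, compared to the maximum
# number of days permitted for the activity.
from datetime import date

def within_standard(measured_from, measured_to, max_calendar_days):
    """Return True if the action was completed within the standard."""
    elapsed = (measured_to - measured_from).days
    return elapsed <= max_calendar_days

# Example: an examination started 40 calendar days after the first action
# meets the 45-day "Start Examination" standard.
print(within_standard(date(2023, 3, 1), date(2023, 4, 10), 45))
```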