Mississippi Department of Human Services, DAB No. 1267 (1991)

Department of Health and Human Services

DEPARTMENTAL APPEALS BOARD

Appellate Division

SUBJECT:  Mississippi Department of Human Services

DATE:  July 25, 1991
Docket No. 89-3
Decision No. 1267

DECISION

The Mississippi Department of Human Services (State) appealed a funding
reduction imposed under section 403(h) of the Social Security Act (Act)
by the Office of Child Support Enforcement (OCSE). 1/  Based on audits
of the State's child support enforcement and paternity establishment
program, OCSE determined that the State did not comply substantially
with the requirements of Title IV-D of the Act.  OCSE proposed a one
percent reduction of the amount otherwise payable to the State for Aid
to Families with Dependent Children (AFDC) during the period July 1,
1987 through June 30, 1988 (a $746,477 reduction).

The State challenged the jurisdiction of the Departmental Appeals Board
to hear this appeal, the conduct of OCSE during this proceeding, the
regulations that governed Title IV-D audits, and the statistical
sampling methodology used by OCSE as a basis for its findings.  The
State did not challenge any of OCSE's findings that the State had not
provided establishment of paternity services in specific cases reviewed
by OCSE.  In fact, the State admitted early in this proceeding that its
performance did not meet the regulatory standard for substantial
compliance.  Tape of 2/15/89 Conference Call.  Resolution of this appeal
was delayed significantly by settlement negotiations between the parties
and by OCSE's adjustments to its statistical methodology late in the
proceeding.

For the reasons stated below, we uphold OCSE's decision to reduce by one
percent the State's AFDC funding for the one-year period beginning July
1, 1987.  Specifically, we conclude that--

o       the Board has jurisdiction to review, reconsider and decide the
State's challenge to this disallowance;

o       none of the allegedly prejudicial OCSE actions during the
disallowance and appeal process harmed the State, and none provide a
basis for overturning this disallowance;

o       OCSE properly applied its interpretation of the statutory term
"substantial compliance" to the time periods at issue here;

o       OCSE reasonably interpreted the statutory requirement for
"substantial compliance" to mean that a state must be taking action to
provide basic child support services (required under the Act) in at
least 75% of the cases requiring those services;

o       OCSE's adoption of a regulation specifying a one-year limit on
the period permitted by the statute for corrective action is not
arbitrary, capricious, or contrary to the purpose of the statute; and

o       the statistical sampling evidence submitted here reliably shows
that the State failed to meet the OCSE audit criterion related to
establishing paternity in Title IV-D cases.

Statutory and regulatory provisions

Each state that operates an AFDC program under Title IV-A of the Act is
required to have a child support enforcement and paternity establishment
program under Title IV-D of the Act.  Section 402(a)(27) of the Act.
The Title IV-D program has been in existence since July 1975.  OCSE has
the responsibility for auditing state Title IV-D programs, pursuant to
section 452(a)(4) of the Act, and evaluating whether the actual
operation of such programs conforms to statutory and regulatory
requirements.  Following adoption of Title IV-D, the participating
states were given 18 months by Congress -- until December 31, 1976 -- to
establish and begin operating their programs before compliance audits
actually began.  Under the applicable statute, a state was subject to a
five percent reduction of its Title IV-A funds if the audit found that
the state was not in compliance.  Congress, however, continuously
extended the initial moratorium on imposition of this funding reduction,
so that no reduction was ever imposed during the first eight years of
the program's operation, although OCSE did continue its annual audits.

On August 16, 1984, Congress adopted the Child Support Enforcement
Amendments of 1984, section 9 of Public Law 98-378 (the 1984
Amendments).  As amended, section 403(h)(1) of the Act provides that--

 if a State's program operated under Part D is found as a result
 of a review conducted under section 452(a)(4) not to have
 complied substantially with the requirements of such part for
 any quarter beginning after September 30, 1983, and the
 Secretary determines that the State's program is not complying
 substantially with such requirements . . ., the amounts
 otherwise payable to the State under this part [A] for such
 quarter and each subsequent quarter, prior to the first quarter
 throughout which the State program is found to be in substantial
 compliance with such requirements, shall be reduced . . . .

(Emphasis added.)

The amended section then provides for graduated reductions, starting
with a reduction of "not less than one nor more than two percent" and
increasing to a maximum of five percent with each consecutive finding
that a state is not complying substantially with Title IV-D
requirements.

The 1984 Amendments provided for the continuation of compliance audits,
which could in appropriate cases be scheduled as infrequently as once
every three years. 2/  Rather than directing immediate reduction of
funding for a state which failed an audit, the Amendments provided that
a reduction could be suspended while the state was given an opportunity
to bring itself into compliance through a corrective action plan
approved by OCSE.  Section 403(h)(2)(A)-(C) of the Act, as amended.  If
a follow-up review of a state's performance showed that the state still
did not achieve substantial compliance, a reduction would be imposed.
Section 403(h)(2)(B)(iii) of the Act.

Section 9(c) of the 1984 Amendments provides that they "shall be
effective on and after October 1, 1983."

OCSE proposed regulations implementing the Amendments on October 5,
1984, 49 Fed. Reg. 39488 (1984), and issued final regulations on October
1, 1985.  50 Fed. Reg. 40120 (1985).  (We refer to these regulations as
the "1985 regulations.")  The 1985 regulations amended parts, but not
all, of the audit regulations at 45 C.F.R. Part 305.  Section 305.20(a),
as amended by the 1985 regulations, provided that, for the fiscal year
(FY) 1984 audit period, certain listed audit criteria (related primarily
to administrative or fiscal matters) "must be met."  This section also
provided that the procedures required by nine audit criteria "must be
used in 75 percent of the cases reviewed for each criterion . . . ."
These criteria relate to performance of basic services provided under a
Title IV-D state plan; one of these is the criterion at issue in this
appeal.  All the service-related audit criteria are based on sections of
45 C.F.R. Part 305 which (with minor exceptions not relevant here) were
originally published in 1976, with minor amendments in 1982.  (We refer
to these provisions, as amended in 1982, as the "existing regulations"
since they were in effect during FY 1984.)

Thus, under the 1985 regulations, substantial compliance for FY 1984
audits was measured by audit criteria from the existing regulations, but
a state had to be providing the required services in 75% of the cases
requiring them. 3/  In follow-up reviews after a corrective action
period (limited by regulation to one year, 45 C.F.R. 305.99(c)), OCSE
would examine only the audit criteria that the state had previously
failed or had complied with only marginally (that is, in 75 to 80% of
the cases reviewed for that criterion).  45 C.F.R. 305.10(b) and 305.99,
as amended.

Background

OCSE's audit for FY 1984 (October 1, 1983 through September 30, 1984)
resulted in a September 24, 1986 notice to the State that it had been
found to have failed to comply "substantially with the requirements of
Title IV-D of the Act" in the following areas:  (1) reports and
maintenance of records, 45 C.F.R. 305.35(a), and (2) establishing
paternity, 45 C.F.R. 305.24(c).  9/24/86 Letter to State from OCSE
Director (State Exhibit (Ex.) N). 4/  OCSE found that 290 of 811 case
files it initially selected could not be located, and that the State
took action in only 62 of 131 sample cases requiring action to establish
paternity (47% of sampled cases). 5/

Rather than appealing OCSE's findings, the State opted to propose a
corrective action plan (State Ex. O) that was accepted by OCSE on
December 18, 1986, and the funding reduction was suspended.  12/2/88
Disallowance letter at 1.  The corrective action period ended on
September 23, 1987.  The follow-up review by OCSE of the State's
performance for calendar year 1987 resulted in the December 2, 1988
notice of a proposed funding reduction that is the subject of this
appeal.  OCSE found that the State had achieved substantial compliance
with the reports and maintenance of records criterion, but had failed
the establishing paternity criterion.  Specifically, OCSE found that the
State's performance had deteriorated during the follow-up review period
-- the State had taken appropriate action in paternity cases in only 20
of 71 sample cases (28% of those cases).  See State Ex. P.

The State filed an appeal of the disallowance that contained numerous,
often duplicative arguments.  In this appeal proceeding, the State
styled these arguments as "Proposition I," "Proposition II," etc. 6/  In
our analysis below, we review each of these propositions (which we will
identify in parentheses), but do not necessarily follow the order in
which the State raised them. 7/

Analysis

I.  The State's procedural challenges to this proceeding are without
merit.

In its notice of appeal, the State argued that the regulations at 45
C.F.R. 205.146(e) and 305.100(g), which refer to "reconsideration" of a
penalty disallowance, provide the State with the right to a
reconsideration by the entity which issued the disallowance, prior to
review by the Board.  The State therefore contended that the Board
should remand this case to OCSE for such a reconsideration.
Notwithstanding the Board Chair's January 5, 1989 ruling that the case
was properly before the Board, the State continued to press this
argument in its briefs (Proposition I).  Despite numerous subsequent
reaffirmations of this ruling throughout this proceeding (see 8/2/89 and
1/23/90 rulings), we briefly discuss it here.

As the Board Chair noted in his ruling, this disallowance was issued by
the top official of OCSE, who expressly directed the State to appeal to
the Board.  Consequently, absent a rule requiring such a review, it
would have been unreasonable for the Board to remand this matter to that
official for "reconsideration." 8/  Moreover, both regulations cited by
the State refer back to section 1116(d) of the Act, which encompasses
reconsiderations that have been delegated to the Board in 45 C.F.R. Part
16, Appendix A.  See 1/5/89 Ruling at 2.  The Board Chair therefore
concluded that the State would be afforded its full right to
reconsideration through an appeal to the Board.

The State argued in its appeal brief that this jurisdictional ruling is
incorrect, citing to Words and Phrases that "As normally used in context
of administrative adjudication `reconsideration' implies reexamination,
. . . by the entity which initially decided . . . ."  36A Words and
Phrases, "Reconsideration" (West Supp. 1991).  Section 1116(d), however
refers to reconsideration by the Secretary, HHS.  The Secretary has
clearly delegated this Board authority to reconsider such disallowances,
on his behalf.  45 C.F.R. Part 16, App. A, section B; 43 Fed. Reg. 9264
(March 6, 1978).  The fact that the Secretary delegated the authority to
make an initial decision to OCSE does not mean that review by this Board
is any less a reconsideration by the Secretary or a decision by the
Secretary. 9/

The State has been afforded its full right to reconsideration through
this two-and-a-half year long proceeding; it can show no harm to its
interests after having had a full and fair opportunity to present its
appeal to this Board.  We therefore reaffirm that this appeal is within
the Board's jurisdiction. 10/

The State also complained about OCSE's introduction late in this
proceeding of a correction to its statistical calculations concerning
the data underlying OCSE's finding for the FY 1984 audit that the State
failed to meet the establishing paternity criterion (Proposition XIV).
The State variously argued that:  this was an attempt by OCSE to reap
the benefits of a remand for reconsideration while opposing an actual
remand (10/12/90 br. at Proposition I); was a violation of "fundamental
fairness" to the State (10/12/90 br. at Proposition III); was an
improper attempt to issue new findings at the close of the briefing
process (10/12/90 br. at Proposition XIV); and effectively rendered the
original penalty disallowance's calculations "void" (10/12/90 br. at
Proposition XV).

We disagree with the State that there was anything untoward or unfair
about OCSE's proffered corrections.  (We note that the State did not
challenge them procedurally until well after they were made, presumably
because there was a chance that the end result might have been favorable
to the State.)  OCSE's changes in this case involved only calculations
concerning data already in the record; they did not involve new evidence
or a new theory of liability.  While of course it would have been easier
for all concerned if these recalculations were not necessary, so long as
the State had sufficient notice to enable it to understand the issues
and OCSE's position, the disallowance letter was adequate for us to
proceed.  See 45 C.F.R. 74.304(c).  The State was given ample
opportunity, including two in-person hearings, to review and comment
thoroughly upon these changes, so that "fundamental fairness" was
clearly provided.  Board procedures give the Board considerable
discretion to enable the parties to clarify or amend their positions as
long as the opposing party has the opportunity to respond to any change.
West Central Wisconsin Community Action Agency, DAB No. 861 (1987):  "In
describing the disallowance notice that gives rise to Board proceedings,
the regulations nowhere suggest that the Board must dismiss a
disallowance merely because the Agency wished to modify the disallowance
rationale based on a changed understanding of the circumstances of the
case."  Id. at n.2.  Since the State was given a full opportunity to
challenge the changes, we conclude that it was not harmed by their
introduction.

The State also argued in its notice of appeal that the Board should
review OCSE's conditional stay of the penalty's imposition because the
State was allegedly thereby denied the full 30 days permitted by
regulation for filing its notice of appeal.  The Chair's 1/5/89 ruling
noted that the practical effect of OCSE's action was to influence the
State to file its notice four days early.  Moreover, no harm was done,
since the regulations contemplate only a rather summary statement of an
appellant's grounds for appeal at the notice of appeal stage (45 C.F.R.
16.7(a)), and the Board gave the State several generous extensions of
time in which to prepare its appeal brief.  See 5/26/89 Letter to the
State, att. 1.  Ultimately, the State was given six months in which to
prepare its appeal brief.

The State continued to challenge OCSE's actions in subsequent briefs,
asking the Board to reconsider its ruling because it set a precedent
that would allow OCSE to require an appellant to file a notice of appeal
"in a few hours."  Appeal br. at 9 (Proposition II).  We fail to see how
this could set a precedent, however, when the Board's regulations
clearly allow a 30-day period, and OCSE's actions cannot override the
regulations.  45 C.F.R. 16.3(b); see also 45 C.F.R. 74.304(d).  The
State did not allege that it was harmed, moreover, and the remedy it
sought from the Board was a remonstration against OCSE, not the reversal
of this disallowance.  The State also requested an amendment to the
Board's regulations to prohibit such actions (reply br. at 19); however,
the means for requesting a rulemaking are a petition under 5 U.S.C.
553(e), not an appeal before the Board. 11/  Consequently, we conclude
that the Chair's 1/5/89 ruling that the State was not harmed is still
valid and we affirm it.

The State also maintained that it was disadvantaged during this
proceeding by the nondisclosure agreement that OCSE required the State's
counsel to sign prior to reviewing the audit workpapers (Proposition
XI).  That agreement provided:

 SAFEGUARDING INFORMATION AND DISCLOSURE RESTRICTIONS

 The use or disclosure of information concerning applicants or
 recipients of support enforcement services must be safeguarded
 in accordance with 45 C.F.R. 303.21 and may not be used for any
 unapproved purpose or be redisclosed to any other Federal, State
 or local body.

 Unauthorized use or disclosure of information obtained through
 the Federal Parent Locator Service may result in civil or
 criminal penalties under 26 U.S.C. 7213(a)(2).

 Tax information must be safeguarded in accordance with 26 U.S.C.
 6103(p)(4).

 I understand the above noted restrictions and affirm that I am
 authorized as an official of my State to have access to, review
 and transmit copies of information, and the State is bound to
 safeguard and protect against wrongful use.  I agree to abide by
 all applicable Federal laws and regulations, and I expressly
 agree that any information disclosed to me which may be
 protected under the Privacy Act, the above cited requirements or
 any other law or Federal regulation will not be redisclosed
 except to authorized officials of my State.

Att. to State's 6/30/89 Letter to Board.

The State interpreted this agreement to mean that the State was unable
to share the contents of these workpapers with an outside consultant,
with other states, or even with the Board during this proceeding.  OCSE
explained that this agreement was standard and was meant to protect
client-specific information.  Thus, all the State had to do to satisfy
the legal requirements of the Privacy Act was to remove identifying
information from any documents submitted to the Board, using a code (or
initials) to identify cases discussed, and to limit sharing of those
documents to those persons with a legitimate need for access who also
sign an agreement to keep this material confidential.  See OCSE 7/6/89
Motion to Compel Mississippi to File a Complete Brief at 3; OCSE 8/25/89
Letter to the Board at 1-2.  In an August 28, 1989 telephone conference
call, the Presiding Board Member discussed this issue with the parties
and specifically ruled that OCSE could not restrict access to the
information and that State counsel could comply with the nondisclosure
agreement by blocking out identifying information.  9/13/89 Summary of
Telephone Conference Call at 2.  At that time, State counsel "stated
that he was specifically waiving any right to be furnished or to inspect
Agency workpapers."  Id.  State's counsel, however, reiterated his
complaints in the State's reply brief (Proposition XI) and at the
hearing.  Transcript of 4/4/91 Hearing ("Tr. I") at 12-41.

We are unable to understand why State counsel clings to this outlandish
reading of the standard disclosure agreement to his client's detriment.
In any event, however, the State clearly was given access to all of the
information it needed (most of which was information from the State's
own files) and given permission to use it to defend this case.  Since
State counsel waived the right to access, we see no reason to discuss
this further.  Consequently, we reject this proposition as a basis for
overturning the disallowance.

The State also raised (as Proposition IV) a contention that the Board
should make an in camera inspection of documents withheld by OCSE in
connection with a Freedom of Information Act (FOIA) request.  Appeal br.
at 12-14.  In addition to access to the audit workpapers, OCSE had
provided hundreds of documents to the State in response to a FOIA
request, but had withheld eight documents containing 33 pages as exempt
from disclosure.  See 9/26/89 Family Support Administration's Response
to Mississippi's Question Regarding Withheld Documents at 1.  The
Presiding Board Member directed OCSE to answer questions about the
contents of those documents (9/13/89 Summary of Telephone Conference
Call), but denied the request for in camera inspection.  See 10/11/89
Summary of Telephone Conference Call; 11/20/89 Summary of Telephone
Conference Call (denying reconsideration of 10/11/89 ruling). 12/  We
agree with the Presiding Board Member's ruling that the documents as
described do not appear to be relevant to the issues raised by the State
in its appeal.  Generally, these documents are predecisional discussions
of proposed statistical sampling plans and calculations; the final
methodology actually used by OCSE is the relevant document for this
proceeding, and it was made available to the State.  Consequently, we
reaffirm the Presiding Board Member's ruling and reject the State's
Proposition IV as a basis for overturning this disallowance.

II.  The State's challenges to the 1985 regulations are without merit.

The State challenged the 1985 regulations that OCSE used in concluding
that the State was not in substantial compliance.  Specifically, the
State argued that--

o       the regulations are impermissibly retroactive under Bowen v.
Georgetown University Hospital, 488 U.S. 204 (1988) (hereafter
Georgetown), since OCSE lacked express statutory authorization to apply
these regulations retroactively (discussed in Proposition V);

o       despite the statutory provision setting an effective date of
October 1, 1983, the legislative history of the 1984 Amendments
indicates that Congress expected any resulting regulations to be
prospective only in effect (also discussed in Proposition V);

o       the 75% standard as applied to the State here is not a valid
interpretation of the statutory term "substantial compliance" since the
State is being penalized for failing to meet only one criterion
(Proposition X);

o       the 75% standard in the regulations had no empirical basis and
therefore was established in an arbitrary and capricious manner under
Maryland v. Mathews, 415 F. Supp. 1206 (D.D.C. 1976), and according to
the State's expert (9/12/90 Affidavit (Appeal record at 1045))
(Proposition VI); and

o       the regulations were invalid because they did not include a
definition of "violations of a technical nature," based on section
403(h)(3), as amended (Proposition VII).

OCSE disputed the State's position, but also pointed out that the Board
is bound by all applicable laws and regulations under 45 C.F.R. 16.14.
The regulations at issue were "effective" on the date of final
publication (October 1, 1985).  However, section 305.20(a), which sets
out the 75% standard for service-related audit criteria, states that it
is to be applied "[f]or the fiscal year 1984 audit period."  The
preamble to the regulations confirmed that OCSE intended to apply this
section to FY 1984 audits, based on the October 1, 1983 effective date
of the 1984 Amendments.  50 Fed. Reg. at 40126, 40138.

We are, of course, bound by the Department's regulations, even if
invalid under a constitutional analysis, if those regulations are
applicable. 13/  While some of the issues here clearly would be
controlled by 45 C.F.R. 16.14, the State's arguments also raise
interrelated questions of applicability.  We do not need to sort out
these issues precisely, however, since we conclude that all of the
State's arguments concerning the regulations are completely without
merit. 14/  Our reasons are:

o       Section 403(h)(1) of the Act, as amended, requires reductions
for states not found to be in substantial compliance in audits "for any
quarter beginning after September 30, 1983," and Congress explicitly
made the 1984 Amendments effective on October 1, 1983.  The
circumstances here are therefore distinguishable from those in
Georgetown, where the agency published cost-limit rules for Medicare
providers in 1984 and attempted to apply the rules to 1981 costs, in the
absence of any statutory authority to do so.  Here, the statute
expressly made the change in the standard retroactive.

o       In support of its argument that the statutory language setting an
October 1, 1983 effective date was not an express authorization for retroactive
rulemaking, the State argued that the legislative history of the 1984
Amendments shows that Congress intended that OCSE's implementing
regulations would have prospective effect only.  Reply br. at 25-26.
The legislative history on which the State relied, however, does not
refer to OCSE's implementation of the substantial compliance standard;
instead, it refers to the expectation by Congress that OCSE would issue
new regulations focusing on whether states were effectively attaining
program objectives (in addition to meeting the existing state plan
requirements). 15/  S. REP. No. 387, 98th Cong., 2d Sess. 32-33 (1984).

o       The effect of the 1985 regulations here is also significantly
different from the effect of the cost-limit rules considered in
Georgetown.  There, Medicare providers were entitled to a specific level
of reimbursement under the regulations in effect in 1981, and the 1984
rules would have retroactively reduced that level.  Here, the AFDC
funding reduction applies to periods after the 1985 regulations were
published.

o       The audit criteria at issue here were in the existing
regulations, had been in effect without substantial change since 1976,
and were based on IV-D state plan requirements. 16/  The 75% standard is
more lenient than the standard in the existing regulations, which
provided that the State must "meet" the criteria.  Even if the State is
correct that OCSE could not reasonably have implemented this by
requiring action in 100% of the cases, the existing regulations clearly
contemplated a compliance level greater than 75%. 17/  Certainly the
1984 Amendments evidence a strong intention to finally hold the states
accountable for providing services 18/, and we see no basis in the
language or history to support the State's apparent interpretation that
Congress meant "substantial compliance" to be something less than the
75% standard adopted by OCSE.

o       More important, the 1985 regulations afforded the State a
corrective action period.  The State had notice of the 75% standard
prior to this period, and more than a year to adjust its administrative
practices before the follow-up review period began.

o       The regulations here merely interpret the statutory term "substantial
compliance."  Obviously, the range of compliance levels OCSE could adopt
is limited by this term, particularly when it is read together with
section 403(h)(3) of the Act (which permits a finding of substantial
compliance only when any noncompliance is of a technical nature).  A
level lower than 75% would have been subject to challenge as
inconsistent with statutory intent.

o       Since the 75% standard reasonably interprets the statutory term
"substantial compliance," the circumstances here are distinguishable
from those considered in Maryland, where the court found that
regulations setting "tolerance levels" for AFDC eligibility
determination errors were not reasonably related to the purposes of the
statute.  Moreover, unlike the "tolerance levels" in Maryland, the 75%
standard here had an empirical basis in past performance levels measured
through OCSE's audits.  While audit results from FYs 1980 and 1981
showed that some states were not yet achieving 75% levels, other states
were achieving 100% levels at that time (see appeal record, vol. VII at
936-948), and OCSE could reasonably expect all states to be achieving
75% levels by FY 1984. 19/  Moreover, while the State implied that this
factual basis had to be published as an integral part of the rulemaking
proceeding (State reply br. at 24), OCSE stated in the preamble that
this level was based on its experience with past audits.  50 Fed. Reg.
40121.

o       Even in the absence of the 1985 regulations, we would reject the
State's position that it should be found to meet the substantial
compliance standard because it failed only a single criterion.  The
record here supports a finding that, in a substantial number of the
cases entrusted to it where paternity had not been established, the
State did not take any action to provide this service.  Yet,
establishing paternity was absolutely essential to the overall child
support program under Title IV-D of the Act.  If the State's argument
were accepted, then thousands of children would not receive the
assistance Congress has provided them and the AFDC program would
continue to bear costs that should be borne by the absent parent.  Thus,
we conclude that the State did not achieve substantial compliance under
any reasonable reading of that term.

o       Finally, we reject the State's arguments based on section
403(h)(3) of the Act.  That section permits OCSE to find substantial
compliance only where any noncompliance is "of a technical nature which
does not adversely affect the performance of the child support
enforcement program."  OCSE implemented this provision through its
regulations, determining that failure to meet the critical
service-related audit criteria in its regulations is not simply
technical since the required activities are essential to an effective
program.  50 Fed. Reg. at 40130.  We find that interpretation to be
reasonable as applied here since the State's failures under a
service-related criterion would adversely affect program performance;
the State took no action whatsoever to provide a basic child support
service in a significant number of cases. 20/  For example, the
follow-up reviewers found that in two of the larger counties visited,
the average IV-D case load was 3,459 cases per case worker.  State Ex.
Q, Att. at 5.

Thus, we conclude that application of the 1985 regulations here was
clearly proper, and that those regulations are consistent with the 1984
Amendments.

III.  The State's arguments concerning the validity of the one-year
corrective action period must also be rejected.

The State also argued that OCSE's regulation limiting corrective action
periods to one year was arbitrarily and capriciously established, and
was inconsistent with the intent of the statute (Proposition VIII).
Section 403(h)(2)(A) of the Act, as amended, provides that --

 the reductions required under paragraph (1) shall be suspended
 for any quarter if-- (i) the State submits a corrective action
 plan, within a period prescribed by the Secretary following
 notice of the finding under paragraph (1), which contains steps
 necessary to achieve substantial compliance within a time period
 which the Secretary finds to be appropriate; . . . .

(Emphasis added.)  The Secretary's implementing regulations at 45 C.F.R.
305.99(c) provide that:

 The penalty will be suspended for a period not to exceed one
 year from the date of the notice [of failure to meet the
 substantial compliance standard] if the following conditions are
 met:

     (1)  Within 60 days of the date of the notice, the State
     submits a corrective action plan to the appropriate Regional
     Office which contains a corrective action period not to
     exceed one year from the date of the notice and which
     contains steps necessary to achieve substantial compliance
     with the requirements of title IV-D of the Act; . . . .

(Emphasis added.)  The State argued that OCSE's blanket determination
that one year was an appropriate length of time for a corrective action
period thwarted congressional intent that a state be given sufficient
time to bring itself into compliance.  According to the State, the
legislative history indicated that a state was to be penalized only if
it refused "to undertake the necessary changes to correct that
situation."  S. REP. No. 387, 98th Cong., 2d Sess. 33 (1984).  In
particular, the State argued that necessary program changes requiring
increased expenditures could only be implemented through state
legislative action, and that in its own case the legislature met only
once a year.  Appeal br. at 28-29.

In response to the State's contentions, OCSE argued that its regulation
was reasonable and fully consistent with the statute and legislative
history.  OCSE contended that audits had been conducted beginning with
the second quarter of FY 1977 and the State, through prior audit
reports, "had been repeatedly warned of its program deficiencies and was
admonished to take action to improve IV-D services."  OCSE Br. at 25.
OCSE noted that the State was implementing a reduction in force of its
attorneys at the same time it was developing its corrective action plan.
OCSE Br. at 9; see also State Ex. O at 5.  Thus, the State's failure to
meet the standard during the one-year corrective action period (in fact,
its performance declined) was attributable more to the State's staffing
decisions than to the time available for corrective action.  OCSE
maintained that its regulation was entirely consistent with the
legislative history of the 1984 Amendments.

We agree with OCSE that its conclusion that a one-year period was
appropriate for corrective action was reasonable, particularly since the
state plan requirements at issue here had been in effect as of 1976.
Moreover, most states had had notice of the particular shortcomings of
their programs from earlier years' audits (see, e.g., New Mexico, n. 3).
In fact, the State did not dispute OCSE's assertion that for fiscal
years 1980 and 1981, the State took action to establish paternity in
only 28% and 14% of the applicable cases respectively.  OCSE Br. at 29;
appeal record, vol. VII at 938 and 944.  Although the State argued that
these previous warnings were ineffectual because there was never any
penalty attached, the State at least knew as of April 1986 (17 months
before its corrective action period began in September 1987), that
OCSE's preliminary audit had found the State's paternity establishment
services to be far below the regulatory standard and that a penalty was
all but imminent.  State Ex. I.  Under these circumstances, the State's
reduction in staffing could reasonably be said to be a refusal to
undertake the necessary changes to correct its deficiency.

Given OCSE's responsibility to audit compliance for 54 jurisdictions,
its decision to adopt an outer limit for the appropriate corrective
action period is well within its discretion.  As we noted above, the
1984 Amendments certainly evidence a strong intention to finally hold
the states accountable for providing services.  The Senate Finance
Committee Report for the 1984 Amendments, which the State quoted out of
context, said:

 In view of the changes proposed . . ., the penalty provisions of
 the law will apply only in cases where States not only fail
 substantially to carry out the requirements of the law but also
 refuse to undertake the necessary changes to correct that
 situation.  For this reason, the Committee cannot foresee any
 situation in which legislative action to suspend these revised
 penalties would be appropriate.

S. REP. No. 387, 98th Cong., 2d Sess. 33 (1984).

We therefore reject the State's contention that the regulation is
arbitrary, capricious or contrary to statutory intent in setting a
one-year corrective action period.  We would be bound by it in any
event.

IV.   The State's statistical sampling arguments do not provide a basis
for overturning the funding reduction.

We next turn to the State's arguments about OCSE's statistical sampling
methodologies.  In both the program results audit and the follow-up
review, OCSE used statistical sampling techniques to evaluate the audit
criteria which required that the State be using its procedures in 75% of
its Title IV-D cases.  The State did not challenge
the audit findings in individual sample cases, nor did the State
challenge generally the use of statistical sampling methods as a basis
for a decision here.  The State raised two issues, however, with respect
to the specific sampling methods OCSE used.

The first issue related to OCSE's use in the follow-up review of a
method called "systematic random sampling."  The Board has in other
cases upheld the use of this method in the absence of any showing that
drawing the sample cases in a systematic way introduced a bias into the
sample results.  New Mexico at 17-18.  The Board thus asked the State to
address whether there was any basis in the data sampled for the
follow-up review for finding bias of the type recognized by
statisticians as invalidating a systematic random sample.  1/31/91
Questions to Parties, Question No. 9.  The State failed to
respond to this question, and presented no evidence of such a bias.  The
State raised no other contention regarding the sampling methodology used
in the follow-up review.  Thus, we affirm the findings of that review
without further discussion.

The second sampling issue related to the question of how to evaluate,
using statistically valid methods, the data from the sample used in the
program results audit.  Below, we first discuss the sampling procedures
and results for this audit.  We then explain how the issues have
developed and narrowed during the course of Board proceedings.  We then
discuss the evidence presented and explain why we find that the State
did not achieve the appropriate standard for the establishing paternity
criterion.

  A.  The sampling procedures and results

The following facts are undisputed.

In the program results audit, OCSE drew a stratified, cluster sample.
The State had ten political subdivisions (referred to as either regional
offices or counties).  These subdivisions were divided into six strata:  strata 1,
2, and 6 each consisted of one political subdivision; strata 3 and 4
each had two political subdivisions; and stratum 5 had three political
subdivisions.  OCSE randomly drew one political subdivision from each of
the three strata with more than one political subdivision and then drew
a number of Title IV-D cases from that political subdivision, with a
probability proportional to the size of the stratum.
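
To illustrate the structure of this design, the following sketch (in
Python) simulates a selection of the kind described above.  The
subdivision labels and the per-stratum case allocations are hypothetical
stand-ins, not figures drawn from the audit workpapers; only the overall
structure (single-subdivision strata included automatically, one
subdivision drawn at random from each multi-subdivision stratum, and a
number of cases then drawn from each selected subdivision) follows the
description in the record.

import random

# Strata as described above: strata 1, 2, and 6 each contain a single
# political subdivision; strata 3 and 4 contain two; stratum 5 contains
# three.  The subdivision labels are illustrative only.
strata = {
    1: ["Subdivision A"],
    2: ["Subdivision B"],
    3: ["Subdivision C", "Subdivision D"],
    4: ["Subdivision E", "Subdivision F"],
    5: ["Subdivision G", "Subdivision H", "Subdivision I"],
    6: ["Subdivision J"],
}

# Hypothetical allocation of sample cases to each stratum, standing in
# for "a number ... proportional to the size of the stratum."  The total
# (501) matches the overall sample size noted later in this decision,
# but the split among strata is invented.
cases_per_stratum = {1: 90, 2: 80, 3: 85, 4: 80, 5: 86, 6: 80}

random.seed(0)
sample_plan = {}
for stratum, subdivisions in strata.items():
    # A single-subdivision stratum is included with certainty; otherwise
    # one subdivision is drawn at random from the stratum.
    if len(subdivisions) == 1:
        selected = subdivisions[0]
    else:
        selected = random.choice(subdivisions)
    sample_plan[stratum] = (selected, cases_per_stratum[stratum])

for stratum, (subdivision, n_cases) in sample_plan.items():
    print(f"Stratum {stratum}: review {n_cases} cases from {subdivision}")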

After selecting the sample, OCSE examined each sample case to determine
what action, if any, was required in the case (in other words, which
audit criteria applied).  For example, if the whereabouts of an absent
parent who owed support was unknown, the case would be classified as a
"locate" case, requiring review to see if the State took any action to
locate the absent parent, as required by 45 C.F.R. 305.33(g).  OCSE then
examined the case files and other records to determine whether the State
had, in fact, taken any required action during the relevant time period,
finding either "action" or "no action" for each sample case reviewed.
For the key criterion at issue here, OCSE found that out of 131 cases
requiring action to establish paternity, the State had taken action in
only 62 cases.

OCSE then used the sample findings to calculate an "efficiency rate" and
an "efficiency range" for each criterion.  The "efficiency rate" is the
single most likely estimate of the percentage of cases requiring review
under the audit criterion which were "action" cases.  The "efficiency
range" is what is known in statistical parlance as a "confidence
interval."  A confidence interval is a statistician's calculation of the
range of values within which the statistician can say, with a specified
degree of certainty, that the true value falls.

Under OCSE's audit procedures, a criterion was considered "unmet" if the
"high range" of the "efficiency range" was less than 75% and only
"marginally met" if the "high range" was 75% to 80%.  The "high range"
figure is also referred to as the "upper limit" or "upper bound" of the
confidence interval.  The effect of OCSE using this figure is that OCSE
assumes the risk associated with potential sampling error.  (In other
words, there is more risk that OCSE will pass a state which in fact
failed than that it will fail a state which in fact passed.)

OCSE calculated the upper limit of the confidence interval by
multiplying the "standard error" of the sample by 1.96 and adding
the result to the efficiency rate.  (As discussed below, 1.96 is a
multiplier which is usually used to give a two-sided 95% confidence
interval.  One of the issues was whether use of this multiplier was
appropriate here.)
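
The arithmetic described in the preceding paragraphs can be shown in a
short sketch.  Solely to illustrate how the efficiency rate, standard
error, and upper limit fit together, the calculation below treats the
paternity sample (62 "action" cases out of 131 requiring action) as if
it were a simple random sample; OCSE's actual calculations had to
account for the stratified, cluster design discussed below, so the
figures this sketch produces are illustrative only and are not the
figures on which the disallowance rests.

from math import sqrt

actions, reviewed = 62, 131

# Point estimate ("efficiency rate") and a simple-random-sample
# approximation of its standard error.
efficiency_rate = actions / reviewed
standard_error = sqrt(efficiency_rate * (1 - efficiency_rate) / reviewed)

# Upper limit ("high range") of the efficiency range: the standard
# error times 1.96, added to the efficiency rate.
upper_limit = efficiency_rate + 1.96 * standard_error

print(f"efficiency rate: {efficiency_rate:.1%}")   # about 47.3%
print(f"upper limit:     {upper_limit:.1%}")       # about 55.9%
print("criterion unmet" if upper_limit < 0.75
      else "marginally met" if upper_limit <= 0.80
      else "met")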

In a FY 1984 program results audit, OCSE examines all audit criteria
listed in section 305.20(a) of the 1985 regulations.  In a follow-up
review, however, OCSE examines only those criteria which are either
"unmet" or which are only "marginally met" in the program results audit.
A state is subject to a penalty after a corrective action period if the
state fails in the follow-up review to meet any criterion which was
"unmet" during the original audit period or to maintain compliance with
any criterion which was only "marginally met."  See 45 C.F.R.
305.100(a).

Here, OCSE found that the State failed to meet only one criterion,
"establishing paternity," in the follow-up review.  The State had failed
this criterion and another criterion, "reports and maintenance of
records" under 45 C.F.R. 305.35, in the program results audit.  Thus, if
we find the State either failed to meet or only "marginally met" (75% to
80%) the "establishing paternity" criterion in the program results
audit, we would find that OCSE properly included that criterion in the
follow-up review.  This is important because the
State's evidence on the statistical sampling method presumed that the
State only had to meet a 75% standard.  In fact, as we pointed out to
the State prior to the hearing (1/15/91 Notice of Hearing at 3-4), if
the State achieved only 80%, it still would have been subject to review
for this criterion in the follow-up review, and a penalty would be
appropriate since the State failed this criterion in the follow-up
review (achieving only 28% as a raw score, leading the reviewers to
conclude with 99.9% confidence that the 75% standard had not been
achieved; see State Ex. P, att. at 4).  See 45 C.F.R. 305.100.
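
The penalty logic discussed in this subsection can be summarized in a
brief sketch.  The thresholds follow the description above; the function
names are ours, and the reading of 45 C.F.R. 305.100 expressed in the
comments is our own paraphrase rather than regulatory text.

def audit_category(upper_limit):
    # Classify a criterion from the upper limit ("high range") of its
    # efficiency range, per the thresholds described earlier.
    if upper_limit < 0.75:
        return "unmet"
    if upper_limit <= 0.80:
        return "marginally met"
    return "met"

def penalty_due(original_upper_limit, followup_upper_limit):
    # As we read the scheme here: a criterion that was "unmet," or only
    # "marginally met," in the program results audit is examined again
    # after the corrective action period, and a penalty follows if the
    # state still does not reach the 75% standard in the follow-up
    # review.
    reviewed_again = audit_category(original_upper_limit) != "met"
    return reviewed_again and followup_upper_limit < 0.75

# Even crediting the State with an 80% upper limit for establishing
# paternity in the program results audit, the follow-up review (in which
# the reviewers concluded with 99.9% confidence that the 75% standard
# was not achieved) still yields a penalty; 0.50 below is simply an
# arbitrary value under 0.75.
print(penalty_due(original_upper_limit=0.80, followup_upper_limit=0.50))  # True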

  B.  How the issues developed

The State did not contest OCSE's findings in individual sample cases
from the program results audit.  Moreover, the State did not initially
challenge the sampling methodology.  As a result of developments in the
related Title IV-D cases, however, OCSE recognized that it had
calculated the "efficiency rate" and "efficiency range" for the
"establishing paternity" criterion as though a simple random sample had
been drawn, rather than the stratified, cluster sample that was in fact
drawn.  In an effort to correct this mistake, OCSE developed several
alternative methodologies, partly in response to comments by a
statistical sampling expert who had appeared for several other states
with similar cases and who ultimately appeared as a witness for
Mississippi.  (OCSE has stipulated that this witness qualifies as an
expert, and we refer to him as "the State's expert.")

Under any of OCSE's methodologies, the State failed to meet the 75%
standard (and therefore did not have more than 80% compliance).  See
OCSE 11/21/90 Submission, att. at Ex. 3.

The percentage figure in question here may be expressed as a ratio of
the number of cases in which action was taken to establish paternity to
the number of cases in which such action was required.  It is undisputed
that there are two generally accepted methods for estimating a ratio
from the results of a stratified sample and calculating the confidence
interval associated with it.  The "separate ratio estimator" method uses
the ratios estimated for each stratum and then calculates an overall
estimate by weighting these ratios according to the relevant stratum
populations.  The "combined ratio estimator" method pools the total
number of cases where action was taken; pools the total number of cases
where action was required; and takes the ratio of these pooled
numbers.  Tr. I at 63; 1 M. Hansen, Stratified Simple Random Sampling
189 (1953 ed.).
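
A short sketch, using hypothetical per-stratum figures, may help to
contrast the two estimators just described.  The stratum populations,
sample sizes, and case counts below are invented for illustration, and
the sketch omits the variance formulas that OCSE's methodologies also
required.

# Hypothetical strata: "population" is the assumed number of Title IV-D
# cases in the stratum, "sampled" the number of sample cases drawn from
# it, "required" the sample cases requiring paternity action, and
# "action" the cases in which action was taken.
strata = [
    {"population": 4000, "sampled": 90, "required": 30, "action": 14},
    {"population": 3000, "sampled": 80, "required": 25, "action": 12},
    {"population": 5000, "sampled": 85, "required": 28, "action": 13},
]

# Separate ratio estimator: weight each stratum's own ratio by that
# stratum's share of the population.
total_pop = sum(s["population"] for s in strata)
separate = sum(
    (s["population"] / total_pop) * (s["action"] / s["required"])
    for s in strata
)

# Combined ratio estimator: expand the sample counts to estimated
# stratum totals, pool them, and take the ratio of the pooled totals.
# (With equal sampling fractions this reduces to pooling the raw
# counts.)
est_action = sum(s["population"] * s["action"] / s["sampled"] for s in strata)
est_required = sum(s["population"] * s["required"] / s["sampled"] for s in strata)
combined = est_action / est_required

print(f"separate ratio estimate: {separate:.1%}")
print(f"combined ratio estimate: {combined:.1%}")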

The State's expert had originally suggested use of the separate ratio
estimator method (see Tr. I at 88), and one of OCSE's methodologies
(Methodology #3) is based on this method.  In using this method, OCSE
"collapsed" each of the strata with more than one political subdivision
(strata 3, 4, and 5) in order to calculate what is called a "within
strata variance." 21/  (The term "variance" is used for the square of
the "standard error."  The formulas at issue here calculate the
variance, then take the square root in order to obtain the "standard
error," which is necessary in order to calculate the confidence
interval.)  The State's expert then raised a number of questions about
OCSE Methodology #3.  While he had acknowledged in other cases that OCSE
was "moving in the right direction" with this method, the State's expert
said further refinements were needed because the particular sample drawn
here had complications not addressed in the statistical treatises.  He
had also expressed his opinion that until a "correct analysis" was done,
one could not say with a 95% degree of certainty that the State had
failed.  Ohio at 15; see State 12/4/90 submission, att.

Rather than having a series of counter affidavits from the parties'
experts, as we had in some of the related cases, the Board set a hearing
here, limited to the sampling issues.  In its notice of hearing, the
Board specifically noted that the State would have a burden to do more
than challenge OCSE's method, given that OCSE had presented an expert's
affidavit asserting it was reliable; the Board clearly indicated that
the State should present what it considered a "correct analysis" or the
Board would presume that, even with modifications to OCSE Methodology
#3, the State would not achieve the correct standard (which we noted was
not 75% but over 80%).  1/15/91 Notice of Hearing at 2-3.

At the hearing, the State's expert did not present any alternative
methodology.  Thus, the State failed to show that there was any
statistically valid analysis of the sample data which would indicate
that the State had passed the review.  The State's expert did, however,
explain more fully what his concerns were about the use, without
refinements for the particular sample here, of the "separate ratio
estimator" method.  OCSE presented some evidence that statisticians
would ordinarily rely on use of Methodology #3, as well as some rebuttal
to the concerns expressed by the State's expert regarding the "separate
ratio estimator" as applied here.

The validity of OCSE's Methodology #3 largely became a moot question in
this proceeding, however. 22/  At the hearing, OCSE also presented an
opinion, by a highly qualified expert (OCSE's expert), that use of a
"combined ratio estimator" was more appropriate here than the "separate
ratio estimator" (Tr. I at 113-115), and the State's expert ultimately
agreed with this opinion.  Transcript of 6/18/91 Hearing (Tr. II) at 20.
23/  The results of using the "combined ratio estimator" were presented
as OCSE Methodology #4.  The State asked for and was given time to have
its expert more carefully examine OCSE Methodology #4, and we continued
the hearing on a later date.

Before the continued hearing, OCSE made a slight modification to its
calculations to address one problem raised by the State's expert
regarding Methodology #4.  The State's expert agreed that this
modification eliminated this concern.  Tr. II at 22.  He also stated
that he did not question the mathematical accuracy of OCSE's
recalculations.  Tr. II at 20.  The recalculated "efficiency rate" for
the "establishing paternity" criterion is 48.9% and the recalculated
upper limit of the confidence interval is 69.4%.  OCSE 5/23/91
submission, att. 2 (Tepping comments) at 1. 24/

The remaining concerns expressed by the State's expert all went to the
issue of whether an "exact" two-sided 95% confidence interval had been
attained in Methodology #4, as modified.  Tr. II at 22-23.  The State's
expert offered no further modifications to the methodology which would
show that the State had achieved 75% (much less over 80%, as required
here), nor did he express any opinion that the State had achieved 75%.
OCSE's expert, on the other hand, expressed his professional opinion
that it was "extremely unlikely" that Mississippi did have a compliance
rate of 75% or higher.  Tr. I at 147; Tr. II at 51.  This evidence
would, by itself, be a sufficient basis on which we could uphold the
penalty since (1) OCSE's expert was highly qualified in dealing with
complicated statistical data (see OCSE 3/13/91 letter, att.); (2) we
found him to be both knowledgeable and credible; and (3) while the
State's expert was also highly qualified (see appeal record, Vol. IX at
1057-1067), he did not appear as experienced in such complicated
sampling situations and, in any event, expressed no opinion to the
contrary.

We nonetheless proceed to address the concerns expressed by the State's
expert, because we find that they are based on an erroneous premise and
that OCSE's evidence rebutted those concerns.  Thus, our findings are
based not only on OCSE's expert's opinion that the State did not achieve
the 75% standard, but on the evidence taken as a whole.

  C.  Analysis

  1.  OCSE did not adopt an "exact" confidence interval.

The concerns expressed by the State's expert were all premised on the
State's view that OCSE had adopted a rule that it would pass a state
unless an "exact" 95% confidence interval could be calculated showing
that the state had not achieved the 75% standard.  Tr. II at 22- 23.  In
Ohio, we noted that OCSE had presented evidence that statistical
validity does not depend on use of a 95% confidence interval.  We
further said that it would appear to be impermissibly arbitrary for OCSE
(once having adopted the 95% level) to use it for some states and not
for others.  Ohio at 17, n. 8.  The State's position that OCSE had a
rule requiring use of an exact 95% confidence interval was partly based
on this statement in Ohio.

In Ohio, however, it was undisputed that the upper limit of a 95%
confidence interval would be obtained by multiplying the "standard
error" by 1.96 and adding that amount to the efficiency rate.  Ohio at
10.  The Board had concluded that OCSE had adopted use of the 95%
confidence interval from OCSE's Program Results Audit Guide.  That
Guide, however, does not specifically say that OCSE would use the upper
limit of a 95% confidence interval.  Rather, it simply sets out a
formula for calculating the confidence interval which uses the
multiplier 1.96.  Appeal record, Vol. IV at 607.  The Board equated this
with the 95% confidence interval because the experts in the Ohio
proceeding did.

The State's expert said here, however, that the multiplier 1.96 does not
always give an exact 95% confidence interval.  He pointed out that some
courts had required a 95% degree of confidence in order to find sample
results reliable.  He did not dispute OCSE's expert testimony (see Tr.
II at 49) that 1.96 is a standard amount generally used for samples with
a size of more than 30 units, and is a multiplier which gives a
"nominal" 95% confidence interval.  The State's expert said, however,
that he had found one article where the authors suggested a multiplier
of 2.00 even though the sample size was more than 500 units.  When the
Board pointed out that multiplying the standard error by 2.00 rather
than 1.96 would not make a difference here (the upper limit of the
confidence interval would still be less than 75%), the State's expert
said that the situation in the article he had referred to was different
from that at issue here since the article was discussing a simple random
sample rather than a stratified sample.  Tr. II at 35.

In any event, we find that OCSE's use of the multiplier 1.96 to achieve
a nominal 95% confidence interval is reasonable here.  First, OCSE's
expert testified that it is impossible to get an "exact" figure, as the
State's expert was demanding (Tr. II at 49), and the State's expert
essentially conceded this.  Tr. II at 26.  Use of the 1.96 multiplier is
"conventional" and statisticians would normally rely on the nominal
confidence interval obtained.  Tr. II at 54.  Indeed, OCSE's policy was
to multiply the "standard error" by 1.96, and even the State's expert had
initially assumed that this was done to establish a 95% confidence
interval.

Moreover, 1.96 is the standard multiplier ordinarily used for a
two-sided 95% confidence interval.  What this means is that there
is only a 2.5% probability that the true value is greater than the upper
limit.  (This is graphically illustrated in State's figures 1 and 2
discussed at the continued hearing.  State's Hearing Ex. 4, figure 1.)
In other words, there is 97.5% certainty that the true value is not
greater than the calculated upper limit.  In light of this, OCSE could
reasonably adopt the 1.96 multiplier, even if it might not in every
instance achieve an exact two-sided 95% confidence interval.
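
The coverage arithmetic behind this point can be checked with a brief
calculation, assuming (as discussed in the next subsection) that the
distribution is approximately normal.  The sketch uses the standard
normal distribution available in Python's statistics library.

from statistics import NormalDist

z = NormalDist()  # standard normal distribution

# Two-sided coverage of an interval built with the 1.96 multiplier, and
# the one-sided probability that the true value exceeds the calculated
# upper limit.
two_sided = z.cdf(1.96) - z.cdf(-1.96)
upper_tail = 1 - z.cdf(1.96)

print(f"two-sided coverage with 1.96:          {two_sided:.3f}")  # 0.950
print(f"chance true value exceeds upper limit: {upper_tail:.3f}")  # 0.025

# The multiplier that gives an exact two-sided 95% interval under
# normality is itself approximately 1.96.
print(f"exact 95% multiplier: {z.inv_cdf(0.975):.4f}")             # about 1.9600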

We also note that the State's expert's concern with use of the 1.96 was
partly based on his concern about the sample size here.  This concern
was based on the fact that not all of the sample cases required review
for the criterion at issue here, and that, in some strata, the number of
cases requiring review was small.  (For a graphic illustration of the
breakdown of the sample, see State's Hearing Ex. 1.)  OCSE's expert
testified that, in the combined ratio estimator method, the effective
sample size for calculating both the numerator and denominator of the
ratio was the total sample size of 501.  Tr. I at 120-121.  This is
consistent with the formulas, and the State's expert did not directly
dispute it. 25/

Finally, the State's expert's opinion that use of the 1.96 might not be
appropriate was partly based on his concern about whether there was a
"normal distribution" here.  Tr. II at 13.  As we discuss next, OCSE
showed that this concern was not warranted.

Thus, we conclude that OCSE was not required to determine an "exact"
confidence interval and that, in any event, use of the 1.96 multiplier
was appropriate.

  2.  The distribution is not a concern.

OCSE's expert acknowledged that, for the 1.96 to produce an exact
confidence interval, one would have to know, among other things, that
the estimated efficiency rate has a "normal distribution" over all
possible samples.  Tr. II at 49.  OCSE's expert presented an analysis
showing that the distribution here was in fact close enough to normal to
act as if it were.  OCSE Hearing Ex. D; see also Tr. II at 64-65.

The State's expert did not question the validity of this analysis, and
indeed acknowledged that it appeared from this analysis that the
efficiency rates were normally distributed.  Tr. II at 72.  The State's
expert said, however, that this evidence was undercut by the results of
some "simulation" experiments OCSE's expert had performed.  As we
discuss next, we find no merit to this argument, and therefore conclude
that the State's expert's concern about the distribution does not
indicate that the results of Methodology #4 are unreliable.

  3.  The simulation experiments support the result.

In order to verify the validity of the methodology here, OCSE's expert
conducted some simulation experiments.  He described these experiments
as follows:

 Each experiment consisted of assuming that the true efficiency
 rate is 75% and assuming varying efficiency rates for primary
 sampling units in the several strata.  Having defined such a
 hypothetical population, a sample using the sample sizes
 actually used was drawn and the estimated efficiency rate and
 the upper bound were computed.  I repeated the process 1000
 times for the assumed population.  In addition I defined two
 other hypothetical populations with a 75% efficiency rate and
 drew 1000 replicates of the sample from each.  I found that, for
 one population, only 33 of the 1000 replicates produced upper
 bounds below 75% and that, for each of the other two
 hypothetical populations, only 44 of the 1000 replicates
 produced upper bounds below 75%. . . .  Moreover, in no instance
 of the 3000 replicates was the upper bound as low as that
 calculated from the Mississippi audit survey for the Paternity
 criterion.

OCSE 5/23/91 submission, 2d att. at 2.
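
The general shape of these experiments can be conveyed by a simplified
sketch.  The per-stratum rates and sample sizes below are hypothetical,
the draws are simple binomial draws rather than draws from the actual
cluster design, and the upper bound is computed from a simple-proportion
standard error rather than from the combined ratio estimator; the sketch
is therefore only an illustration of the replication idea, not a
reproduction of the simulations performed by OCSE's expert.

import random
from math import sqrt

random.seed(1)

# Hypothetical population: stratum sample sizes totaling 501 and
# per-stratum rates chosen so that the overall true efficiency rate is
# approximately 75%.
stratum_sizes = [90, 80, 85, 80, 86, 80]
stratum_rates = [0.70, 0.80, 0.75, 0.72, 0.78, 0.75]

replicates = 1000
low_upper_bounds = 0
for _ in range(replicates):
    # Draw a sample from the hypothetical population and compute the
    # estimated efficiency rate and its nominal upper bound.
    actions = sum(
        sum(random.random() < rate for _ in range(n))
        for n, rate in zip(stratum_sizes, stratum_rates)
    )
    n_total = sum(stratum_sizes)
    rate = actions / n_total
    upper = rate + 1.96 * sqrt(rate * (1 - rate) / n_total)
    if upper < 0.75:
        low_upper_bounds += 1

print(f"{low_upper_bounds} of {replicates} replicates "
      f"produced upper bounds below 75%")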

The State's expert noted that, if there were a normal distribution, one
would expect that there is only a 2.5% probability that the true value
is greater than the upper limit calculated from the sample.  He said
that the simulation experiments undercut this assumption because they
showed either a 3.3% probability (33 out of 1000 replicates) or a 4.4%
probability (44 out of 1000 replicates).

The assumption of a normal distribution here means there is only a 2.5%
probability that the State achieved a rate higher than the upper limit of
69.4%.  The results of the simulation
experiments show only that the methods of analysis used by OCSE could
result in nominal upper limits lower than 75% (even if that were the
true value) as many as 44 out of 1000 times (or, 4.4% of the time).
Thus, we do not view the simulation results as undercutting the
assumption that there is only a 2.5% probability that the State in fact
achieved higher than 69.4%.  Even if the State's expert is correct that
the simulation results somehow indicate the distribution is not normal,
other evidence admittedly does indicate such a distribution, and the
State's expert merely said the simulation results raised a concern about
this.  He did not say that the simulation definitively resolved this
issue.

Moreover, OCSE's expert testified that the significant result of his
simulations was that in no instance in all 3000 replicates was the upper
limit as low as 69.4% and that, in his opinion, this made it "extremely
unlikely" that the State in fact achieved 75%.  Tr. II at 47.  The
State's expert simply said that he could express no opinion on this.
Tr. II at 76-77.

Thus, we conclude that the simulation experiments support a finding that
the State did not achieve 75%, rather than undercutting the result here.

  3.  The other concerns also lack merit.

Another concern the State's expert had expressed about the combined ratio
estimator was whether Methodology #4 sufficiently took account of the
"within strata" variances.  He subsequently dropped this as a "substantial
concern."  Appeal record, Vol. XI at 1072.

We fail to see how it is a concern at all.  OCSE's expert responded that
the calculated variance contains not only a "within strata" component but
also a variance contribution from the "collapsed" stratum (original
strata 3, 4, and 5).  Thus, he said that the estimator
is expected to provide an overestimate of the true variance (which would
favor the State).  OCSE 5/23/91 submission, att. 2; Tr. I at 116,
128-129.  The State's expert agreed that, looked at mathematically, the
formula tends to overestimate.  Tr. II at 36-37.

The State's expert also noted that in calculating the variance,
Methodology #4 used an approximation.  OCSE's expert testified that he
had used what is called the "Taylor's series" approximation and that, in
his view, this was a "pretty good" approximation.  Tr. II at 50.  The
State's expert agreed that any error in this approximation "in and of
itself, was tolerable."  Tr. II at 40.
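
For reference, the first-order Taylor series (linearization) approximation
to the variance of a ratio estimator is conventionally written (again in
notation of our own choosing) as

 \widehat{V}(\hat{R}) \;\approx\; \frac{1}{\hat{X}^{2}}
 \left[\,\widehat{V}(\hat{Y}) + \hat{R}^{2}\,\widehat{V}(\hat{X})
 - 2\,\hat{R}\,\widehat{\mathrm{Cov}}(\hat{X},\hat{Y})\,\right],

where \hat{Y} and \hat{X} are the estimated totals of cases with action
and of cases requiring review, respectively.  We offer this only to
illustrate the nature of the approximation the experts discussed; the
record does not set out the precise formula OCSE used.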

He then explained that his concern was that more than one approximation
was being used here and that he was not sure of the cumulative effect.
He did not dispute OCSE's expert's assertion (Tr. II at 50), however,
that the other approximations were both phases of the same
approximation, depending on the assumption that we have a normal
distribution and that the numbers would act as if we had drawn a simple
random sample from a normal distribution.  As discussed above, there is
evidence that these assumptions are warranted here.

Moreover, the State's expert agreed that it was a matter of judgment
whether to tolerate any approximations in calculating the confidence
interval, but suggested a conservative outlook given the consequences
for the State here.  Tr. II at 26.  In our view, OCSE did take a very
conservative approach on the whole here and any error in calculating the
upper limit most likely favored the State (due to the overestimation of
the variance from collapsing the strata and the bias in the estimator in
favor of the State).  In any event, we think any such error would be
immaterial.  The upper limit OCSE relied on was 69.4%.  The State would
have had to achieve 75% to meet the criterion and over 80% to have more
than marginally met the criterion, and there is absolutely no evidence
that would support a conclusion that the State achieved either level.

In sum, we find that the overwhelming weight of the evidence here is
that the State did not achieve a 75% standard for the "establishing
paternity" criterion in the program results audit period.  In our view,
we can say this with at least a 95% degree of certainty, and can say
with an even higher degree of certainty that the State did not achieve
over 80%, which it would have to achieve in order not to have this
criterion subject to review after the corrective action period. 26/
Since the State also failed to achieve 75% in the follow-up review, OCSE
properly imposed a funding reduction.

Conclusion

For the reasons stated above, we uphold OCSE's decision to reduce by one
percent the State's AFDC funding for the one-year period beginning July
1, 1987.


 _____________________________ Judith A. Ballard


 _____________________________ Donald F. Garrett

 _____________________________ Alexander G. Teitz
 Presiding Board Member

1.   The Mississippi agency responsible for the Title IV-D program changed
its name from Department of Public Welfare to Department of Human Services
while this appeal was pending.

2.   The State alleged that the statute authorized OCSE to do an annual
audit only if a previous penalty had been imposed.  See, e.g., Reply
brief (br.) at 5-6.  This is erroneous -- the statute provides for
audits to be conducted "not less often" than once every three years,
unless there has been a penalty or corrective action period implemented,
in which case an annual audit is required.  Section 452(a)(4) of the
Act.  The purpose of the provision was to give the Secretary more
flexibility in allocating audit resources than had existed before, when
annual audits of every jurisdiction were absolutely required.  See
Section 452(a)(4) of the Act (before the 1984 Amendments).  Thus, OCSE
had discretion to audit this State for FY 1984.

3.   The State noted that in its October 1984 notice of proposed
rulemaking OCSE had assured states that it would continue to apply
current audit regulations for fiscal years beginning prior to FY 1985;
the State argued that that assurance was violated by allegedly "adding"
criteria.  Initial br. at 30 (Proposition IX).  As is obvious from the
text, we do not agree with this view and, since the State did not
explain its interpretation after having received notice of our contrary
interpretation from other Title IV-D decisions (see n. 14 below), we do
not discuss it further here.

4.   This was not the State's first notice of problems with its program.
OCSE began compliance audits in 1977, and the State did not dispute
OCSE's claim that the State "had been repeatedly warned of its program
deficiencies and was admonished to take action to improve Title IV-D
services."  OCSE Br. at 25.  The State also did not dispute that in FYs
1980 and 1981 its rough scores for the establishing paternity criterion
were 28% and 14% respectively.  See Appeal record vol. VII at 938, 944.
No penalty was previously imposed, however, due to congressional
moratoria.

5.   We note that the auditors examined whether any efforts were made by
the State in these cases to furnish these services.  The success of
these efforts, while noted for statistical purposes, was not
determinative as to whether the State was found to be in substantial
compliance; the State received credit for "action" so long as it took
some action towards these goals.  See 50 Fed. Reg. at 40132.

6.   We are obliged to adopt the State's labelling of its arguments
since it failed to number the pages in several of its briefs.  See,
e.g., 1/23/90 Ruling at 2.

7.   Proposition XII stated that an evidentiary hearing should be held
in this case.  The State ultimately withdrew this request (10/12/90 br.
at Proposition XII), but then the Board convened a hearing on its own
motion to hear testimony on the statistical calculation issue discussed
in Section IV of this decision (see Board's 12/6/90 and 1/15/91 letters
to parties).

8.   Furthermore, even though the State had filed an appeal, OCSE
retained the option of settling the case, i.e., reconsidering pursuit of
this matter.  In fact, the record shows that the Board gave the parties
several extensions of time before requiring briefing in order to permit
them to pursue possible settlement.  See, e.g., 4/27/89 Letter to
Parties.

9.   The State also raised for the first time in its reply brief the
question whether this proceeding was a "plan conformity matter" rather
than a disallowance case, in which case it should be remanded to OCSE
under regulations applicable to such issues.  See 45 C.F.R. 201.6.
Introducing this issue violated the Presiding Board Member's 2/17/89 and
6/12/89 rulings that, since the State was given an extraordinarily long
time to prepare its initial brief, all arguments not raised in that
brief were waived.  In any event, the issue here is not whether the
State's child support enforcement plan met federal requirements, but
whether its implementation of that plan was so faulty that a penalty
disallowance should be imposed.

10.   The State asked for en banc reconsideration of this issue.  The
Board's regulations clearly put jurisdictional questions under the
authority of the Chair.  See 45 C.F.R. 16.7(b) and Part 16, App. A,
section G.  There is no provision in the Board's regulations for en banc
reconsideration.  The three Board members assigned to this case agree
with the Chair's ruling.

11.   The State asked for similar remedies under its  Proposition III,
in which it complained of a violation of "due process" due to OCSE's
notification after the disallowance letter that interest would be
charged.  (We had previously noted to the State (in our 1/23/90 Ruling
at n.2) that "due process" per se does not apply to a state.  South
Carolina v. Katzenbach, 383 U.S. 301 (1966)).  The interest dispute was
evidently settled (appeal record, Vol. VII at 956-959), however, so we
have no controversy before us to resolve on this point (nor, as
discussed above, do we need to address the State's request for a
regulatory amendment made in this fashion).

12.   The Presiding Board Member noted in his 10/11/89 ruling that the
appropriate forum for the State to pursue its appeal of this FOIA matter
was before a federal court.  The State did file such an appeal, which
was apparently settled by the parties.  Reply br. at 24; State's
10/12/90 br. at Proposition IV.  We do not know the terms of the
settlement, but the State always could have sought and been granted
permission to supplement the record with any documents obtained through
FOIA.  45 C.F.R. 16.13.

13.   The State argued (Proposition V) that the Board had jurisdiction
to apply Georgetown in this case, even though that decision was issued
ten days after the disallowance at issue here.  Appeal br. at 20.  OCSE
did not argue that the fact that Georgetown was issued after the
disallowance precluded us from applying it, and, in any event, since we
find Georgetown distinguishable, the State's argument is not material.

14.   Our conclusion here closely parallels our analysis of virtually
identical arguments made by the parties in Board decisions on other
Title IV-D appeals: Ohio Dept. of Human Services, DAB No. 1202 (1990);
Oklahoma Dept. of Human Services, DAB No. 1223 (1991); New Mexico Human
Services Dept., DAB No. 1224 (1991); District of Columbia Dept. of Human
Services, DAB No. 1228 (1991); Arizona Dept. of Economic Security, DAB
No. 1255 (1991).  Copies of all these decisions were furnished to the
parties in this case for comment on any issues that were applicable.
Neither party filed any such comments.

15.   The 1985 regulations added new performance-related indicators for
use beginning with the FY 1988 audit period.

16.   The State seemed to argue that since there had previously been no
teeth to the statute, due to the moratoria, there was no legal
obligation for the State to improve its performance (Reply br. at 5),
and that the compliance standard was 0%, not 100% (Reply br. at 28-29).
The moratoria only delayed imposition of the penalty, however.  Congress
did not repeal this requirement or guarantee that no penalty would ever
be imposed.  Moreover, the statute at all times required the State to
have in effect a child support enforcement plan,  as a condition for
receiving AFDC payments and as a condition for receiving funding under
Title IV-D.  Sections 402(a)(27) and 455(a) of the Act (1983).

17.   The existing regulations required the states to have and be
utilizing written procedures detailing step by step actions to be taken.
45 C.F.R. 305.1, 305.24; 45 C.F.R. Part 303 (1983).  Although no
reduction had actually been imposed based on the existing audit
criteria, this was due to the moratoria.

18.   The Senate Finance Committee Report stated:

 In view of the changes proposed . . ., the penalty provisions of
 the law will apply only in cases where States not only fail
 substantially to carry out the requirements of the law but also
 refuse to undertake the necessary changes to correct that
 situation.  For this reason, the Committee cannot foresee any
 situation in which legislative action to suspend these revised
 penalties would be appropriate.

S. REP. No. 387, 98th Cong., 2d Sess. 33 (1984).

19.   We note that the percentages given in the draft analyses by OCSE
of 1980 and 1981 audit results (appeal record, Vol. VII at 936-948) are
derived simply by dividing the number of complying sample cases by the
total number reviewed.  If OCSE had instead used the same method for
estimating compliance levels it used in the 1984 and 1985 audits for all
states (see our discussion below), the compliance percentages shown on
the draft analyses for the earlier years would have been higher.
Moreover, in Maryland, the Secretary had acknowledged that some errors
in making eligibility determinations were unavoidable due to the complex
nature of the requirements.  Here, the State did not argue that the
service-related requirements were complex or that there was any barrier
to meeting those requirements which could not be overcome.

20.   The State suggested that, if it were permitted reconsideration by
OCSE, it might show that its violations were of a "technical nature,"
and it speculated on some possible circumstances that would fit that
definition.  Appeal br. at 25.  However, the State never offered any
evidence that any of the cases found to be "no action" in either review
met these hypothetical circumstances.  The State also maintained that
"the mere placing of a piece of paper in the case files pulled in a year
already completed, was a purely mechanical activity in paternity
establishment cases and amounted to failures of a technical nature."
Reply br. at 38.  We are not certain what the State meant by this.  If
the State meant that it had, in fact, taken action in the sample cases
and simply failed to document it in the case files, this is a mere
assertion for which the State provided no support.  The State should
have been aware, moreover, from its reading of the District of Columbia
case, DAB No. 1228, that OCSE did not strictly require a piece of paper
in each file, but has been willing to accept as evidence of action,
reliable computer lists which include a case.  If, on the other hand,
the State meant that OCSE would accept the placing of a piece of paper
in a file as an "action," this is not true, and, in any event, the State
can hardly be heard to argue that failure to do even this much amounted
to violations of a technical nature.

21.   While it would have been better from the standpoint of obtaining a
smaller sampling error if more than one political subdivision had been
selected, apparently this was not feasible under the circumstances.  Tr.
I at 115-118.  As discussed below, however, the expected effect of
collapsing the strata favored the State.

22.   As a result, we do not address the validity of OCSE Methodology #3
here.  This does not, however, undercut our decisions in Ohio and New
Mexico where OCSE had relied on Methodology #3.  In those cases, we
found that the States' evidence was insufficient to rebut OCSE's
evidence about the validity of the method and that, in any event, the
States had not shown that any further refinements would change the
ultimate results.  We also found that further refinements were unlikely
to change the ultimate results given the raw sample data.  We note that,
despite the State's expert's implication that adjustments or a more
precise alternative to OCSE Methodology #3 would favor the States, the
separate ratio estimator method actually resulted in a higher upper
limit for Mississippi than the combined ratio estimator method.

23.   The State's expert originally determined that the combined ratio
estimator was not appropriate here because the ratios were not constant
from stratum to stratum.  Tr. I at 81-82.  OCSE's expert stated a strong
opinion that this was erroneous.  Tr. I at 114-115.  The State's expert
ultimately dropped this objection.

24.   Even the State's expert described the efficiency rate (48.9%) as
the "best guess of the true efficiency rate."  Tr. II at 10.  While he
originally responded to OCSE's expert's presentation of OCSE Methodology
#4 by indicating that there might be a question about whether the
estimate was biased (but noting that OCSE's expert had determined it was
not), the State's expert did not indicate any concern about this after
further consideration.  See Tr. I at 195; see also Tr. I at 161.  OCSE's
expert said the estimator was biased 5% in favor of the State.  Tr. II
at 161.

25.   If we agreed with the State's expert that the sample size was too
small here, the appropriate result would be to remand to OCSE to
increase the sample size (and therefore the reliability of the results).
Yet, OCSE's expert testified that it would take a "considerable increase
in the sample size" to get a better estimate and that he did not see how
this would benefit the State, given that an increase would reduce both
the length of the confidence interval and the bias of the estimate (which
he said was 5% in favor of the State).  Tr. I at 160-161.  The
State's expert did not state an opinion to the contrary and acknowledged
that the smaller the sample size, the wider the confidence interval.
Tr. I at 52.  While the State said it was willing to bear the burden of
an increase in its costs for OCSE's review of an increased sample size,
we think that a remand here would be a total waste of time and money for
both parties.
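
The testimony that a "considerable increase in the sample size" would be
needed is consistent with the usual relationship between sample size and
sampling error, under which the standard error shrinks only in proportion
to the square root of the sample size:

 \widehat{SE}(\hat{R}) \;\propto\; \frac{1}{\sqrt{n}}\,,

so that halving the width of the confidence interval would require
roughly quadrupling the number of cases reviewed.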

26.   Before the hearing, we suggested that the State had a burden to
show that its alleged "correct analysis" would make a difference here.
The State questioned whether it should have this burden.  We need not
decide, however, whether the State had to meet such a burden, since we
find that, even if OCSE properly had the burden to establish the
reliability of its results, it clearly met that burden.