
Proposed Evidentiary Rule 707: Addressing A Nonexistent Problem Instead Of Real Ones – Above the Law

Back in May 2025, the U.S. Judicial Conference's Advisory Committee on Evidence Rules proposed a new draft Rule to the Federal Rules of Evidence. The Rule purportedly would regulate how AI-generated evidence is to be introduced at federal trials. The Committee opened the Rule up for comment, and on January 15, 2026, a hearing was held. Suffice it to say there was not much support: for once, plaintiffs' class action lawyers and in-house legal counsel both objected. And for good reason: the proposed rule addresses a problem that's not a problem while ignoring the very real challenges facing the judiciary.


The Proposed Rule

The draft Rule in question, Rule 707, states as follows:

When machine-generated evidence is offered without an expert witness and would be subject to Rule 702 if testified to by a witness, the court may admit the evidence only if it satisfies the requirements of Rule 702(a)-(d). This rule does not apply to the output of simple scientific instruments.


Rule 702 deals with the standards by which expert testimony and evidence are determined to be admissible. Generally speaking, the evidence must be helpful to the trier of fact, be based on reliable principles, and constitute a reliable application of those principles. The Rule grew out of the Supreme Court's 1993 ruling in Daubert v. Merrell Dow Pharmaceuticals, which set standards for how courts are to assess the admissibility of expert testimony. The result is frequent hearings challenging the admissibility of expert testimony in court.

According to the Committee Notes on the proposed Rule, the idea is that the same standards for admissibility should apply to “machine-generated evidence” offered by a layperson as to that offered by an expert. The Committee reasoned that a lay witness might apply a “program” but know little or nothing about its reliability. The Committee seemed to be thinking of a technician who would enter a question, print out the answer, and then testify about what they found.

This Rule appears to be one of the first efforts to deal with what is presumably AI-generated evidence.


Rule 707 Ignores the Real Issues

What is alarming is that the Rules Committee is focusing on something that is not that big of a problem, particularly for experienced lawyers and judges, when there are plenty of bigger issues facing the judiciary. Like the proliferation of deepfakes, about which I have written several times. Like the threats of AI to judicial decision-making and how judges should be using it, also a subject I and others have addressed. Like the cost of litigation that freezes so many people out of courtrooms. Like bias in the AI tools that are being used every day. Like the lack of technological training and understanding many judges desperately need to conduct fair trials. Like access to justice. Like threats to the rule of law itself.

If you want a good understanding of the threats to the judiciary and what can be done about them, I suggest you read the comments of Dr. Maura Grossman, Hon. Herbert Dixon, Hon. Allison Goddard, Hon. Xavier Rodriguez, Hon. Scott Schlegel, and Hon. Samuel Thumma in the section entitled AI and the Courts in the recent ABA Report on the Impact of AI on the Practice of Law. I know most of these judges and experts: they are truly the cutting-edge thinkers on the state of the judiciary today.

The proposed Rule focuses on none of these things. Nor does any other proposed Rule.


Rule 707 Ignores Courtroom Practicalities

Beyond ignoring serious problems, Rule 707 creates new ones and ignores practicalities.

First and foremost, what the hell is machine-generated evidence? A calculator is a machine and can generate numbers used as evidence. So can a spreadsheet on a computer. So can a lot of “machines” we use in daily life. The Committee tries to save itself by adding the last sentence, which provides that the Rule won't apply to simple scientific instruments. But again, what does that mean? What is considered simple? For that matter, what is an instrument? Not to mention that what is considered a simple instrument today was, once upon a time, far from it.

Certainly, the rule is aimed at GenAI. But the Committee ought to just say that and provide a definition. Otherwise, it just muddies the water.

Another faulty presumption: experts who testify have more knowledge about the reality of machine-generated evidence than anyone else. The truth is some experts may have knowledge of this kind of evidence (whatever it is), but many don't. An accident reconstruction expert may know how to reconstruct an accident but not how ChatGPT created the drawing and image they want to use.

And courtroom practicalities suggest that we don't really need this kind of ambiguity. If a lay witness wants to use evidence generated by ChatGPT, or anything else for that matter, most lawyers would request a voir dire examination in front of just the judge and, if necessary, then argue that the witness is not qualified to talk about or rely on something they know little about. Why get into whether the evidence is machine-generated?

A lay witness is not an expert and can't render opinions on technical issues for which they aren't qualified. We don't need a separate rule.

And our trials (when we have them, which isn't often, at least on the civil side, as I have discussed) are already too costly and time-consuming. Under this new Rule, we would now have to have a separate Rule 702 or Daubert-type inquiry for lay testimony when admissibility is easy to resolve. I've been in those Daubert fights. They are long and often pointless.

Plus, it opens the door to all sorts of attacks on lay testimony based upon the claim that the witness used some sort of evidence that was generated by a machine. More attacks, more time spent on things that are obvious.

Experienced judges can deal with the admissibility of lay-witness testimony. They don't need more unclear rules for lawyers to waste their time arguing about.

And if we don't have enough experienced judges, then let's try to solve that real problem.

Many similar sentiments were expressed at the January 15th hearing.


Both Sides Question the Proposed Rule

According to a Reuters article reporting on the hearing, those attending generally made many of the same points. The interesting thing is that both sides of the aisle seemed to question whether the rule was a good idea right now, particularly when technology is changing so rapidly. Plaintiffs' lawyers worried that it could increase costs, especially in complex litigation. The in-house legal representatives worried that the rule would create uncertainty and was overly broad, since it could sweep in routine uses of AI tools.

Both sides worried that it would result in more expert testimony. This is a particularly valid point. Experts are great but, in my experience, there is usually an expert somewhere, someplace, willing to opine on virtually anything, at the right price. The result is that the process is gummed up and juries are left confused and uncertain when presented with diametrically opposed opinions from two seemingly qualified experts. We don't need more of that.


Taking the Easy Way Never Works

There are lots of hard topics we need to address, as set out above. Many may be controversial. But they are real. Taking the easy way out and offering Rules like 707 that aren't needed seems like an effort to appear to be doing something while avoiding things we don't want to talk about.

That's not a solution. Lawyers and judges should unite and talk about the real problems and solutions that challenge us both if we are interested in the rule of law. Because the real problems confronting us in the new age of AI, if not handled correctly and soon, may threaten the rule-of-law bedrock on which we all depend.






Stephen Embry is a lawyer, speaker, blogger, and writer. He publishes TechLaw Crossroads, a blog devoted to the examination of the tension between technology, the law, and the practice of law.