
While wanting to know the guest list is a reasonable urge, going far enough to actually ask is usually a setup for social embarrassment. Asking who got invited to the destination wedding? You don’t know because you weren’t invited, and either you aren’t as bright a star in the wedded-to-be’s universe as you thought or you give off broke vibes. You start asking the bouncer who’s on the invite list and why it doesn’t include you? You’re vampiric sketch comedy. But the lowest example of list inquiry has to be asking a mourning family who all showed up to their dead son’s memorial service.
Tech Buzz has coverage:
OpenAI has asked the family of 16-year-old Adam Raine for a complete list of attendees from their son’s memorial service, escalating tensions in the wrongful death lawsuit that alleges ChatGPT conversations led to the teen’s suicide. The discovery request, which family lawyers call “intentional harassment,” comes as the Raine family updated their lawsuit with explosive new allegations about safety shortcuts.
Ah yes, who among us hasn’t tabulated funeral attendees and compared that list with our expectations, like an insecure twenty-something checking to see if every person that RSVP’d on Partiful actually walks through the door? OpenAI isn’t stopping at the who’s who, mind you — they also want videos, photographs, and eulogies. If there ends up being an “Asshole” special mention in our eventual Lawyer of the Year contest, this is going to be a pretty hard candidate to beat.
This isn’t to say that there is no coherent legal rationale for why OpenAI would do something like this. To the degree that the suit hinges on the claim that GPT-4 isolated Raine from support networks like friends and family, it would make sense to contact the people who were close to him. But is a memorial service really a proper site of discovery? There are other ways to determine degrees of isolation and probe whether anyone could have intervened.
I know the days of a MySpace top 5 are long gone, but it would have been a lot less intrusive to cull information from his social media use.
Legality aside, this is a bad look for a company valued at $500B. It makes you wonder: why didn’t they just push to settle this away from the public eye instead of pursuing strategies that would make a shameless man find some?
Did they run the numbers and figure that AI-related mishaps and suicides would pose such a big exposure that they needed an aggressive legal strategy on the books to convince people not even to bother to sue?
Maybe they just asked ChatGPT-5 what their legal strategy should be and they’re going along with whatever it spat out.
Either way, if the ecological consequences of AI use or the absolutely bonkers AI bubble threatening to pop at any moment aren’t big enough information hazards to encourage you to scale back your dependence on whatever iteration of ChatGPT promises to be your friend or help you with your troubles, maybe knowing that the company would depose whoever goes to your funeral will factor into how reliant you are on the tech.
OpenAI Demands Memorial Attendee List In Teen Suicide Lawsuit [Tech Buzz]
Earlier: ChatGPT Suicide Suit: How Can The Law Assign Liability For AI Tragedy?
Chris Williams became a social media manager and assistant editor for Above the Law in June 2021. Prior to joining the staff, he moonlighted as a minor Memelord™ in the Facebook group Law School Memes for Edgy T14s. He endured Missouri long enough to graduate from Washington University in St. Louis School of Law. He is a former boatbuilder who is learning to swim, is interested in critical race theory, philosophy, and humor, and has a love for cycling that occasionally annoys his peers. You can reach him by email at [email protected] and by tweet at @WritesForRent.
