It was 40 years ago today that my cousin Anthony, only a few
months older than me, tragically passed away from complications he suffered in
a terrible car accident on New Year’s Day, 1986. With two younger sisters and
no brothers of my own, Anthony was the closest thing I had to a brother.
Despite my own early talent with computers, his talent in that area always
exceeded mine. I can only imagine how much he would have accomplished had his
life not been cruelly cut short. This writing is dedicated to his memory on
this anniversary of his passing.
I live in a strange kind of love-hate relationship with
technology, and at this point it basically defines who I am as a professional.
Everything I do, as an information governance consultant,
and in my past roles as legal counsel, comes down to risk. Not just the obvious
“Are we going to get sued?” kind of risk, but the subtler forms: regulatory
exposure, reputational damage, operational disruption, even the risk of
misunderstanding complex technology in front of a judge who is still wrestling
with email and metadata, let alone qubits. For a long time I said my
work was about “cost and risk,” but the more I dug in, the more I realized cost
is just another risk vector. It is one more variable in a massive equation
where the unknowns keep multiplying.
My love story with computers started early. In the early
1980s, when most people still treated school computers like mysterious, fragile
boxes, I was a high school kid in South Brunswick teaching the teachers how to
use theirs. I loved the feeling that this little machine could do so much more
than people expected: faster, cleaner, more elegantly. Back then, the worst-case
scenarios were losing a file, crashing a program, or causing a power outage
(which I did once… oops). Now, that same category of machine sits at the center
of cybersecurity incidents, data breaches, cross-border data fights, and messy
litigation that can turn on a single mismanaged email.
Today, computers are not background tools in my world; they
are crime scenes, key witnesses, and sometimes co-conspirators. When I discuss
cybersecurity, eDiscovery, internal investigations, data breach audits, FOIA
requests, and social media governance, I am really talking about the same thing
from different angles: how much risk is hidden in all this data, and how we
keep it from exploding at the worst possible time. Every decision about data
carries legal consequences: where it lives, how long it is kept, who can touch
it, how it is secured. I might be helping a client with M&A due
diligence one day, worrying about data migration and legacy systems, and the
next day focused on remote data collection, self-collection pitfalls, or
redaction failures that could accidentally expose privileged or sensitive
information.
The love part is that I genuinely enjoy this work. The
people I have had the good fortune to meet in the legal technology
community are among the finest I have known. Some of my industry colleagues
have become like extended family over the past 20 years, during which I have
focused primarily on technology. I also like the intellectual challenge of
untangling a messy data lake, designing sane retention policies, or wrestling
with multilingual data sets where one document can contain multiple languages
and nested links to other files. I like translating all of this into plain
language for judges, executives, or regulators who do not live in the weeds.
There is real satisfaction in helping someone see that information governance,
data protection, and legal operations are not just overhead; they are survival
skills.
But there is a dark edge to that love, and that is where the
hate comes in. Technology has given us tools like generative AI that can draft,
summarize, and search in ways that were nearly unthinkable a few years ago. It
is exhilarating to explore generative AI solutions for legal practice, to think
about how they can help lawyers be more efficient, more creative, and more
informed. However, the very same tools hallucinate cases out of thin air,
create deepfakes that can contaminate evidentiary records, and encourage people
to place blind trust in systems they barely understand. When you work at the
intersection of technology and law, you do not just see “cool new tech”; you
see new failure modes: malpractice, sanctions, regulatory fines, and a long
tail of unintended consequences.
Quantum computing is the next chapter in this story, and it
might be the most extreme version of the love-hate dynamic. The concepts that
drive it (superposition, uncertainty, entanglement, decoherence) are
fascinating to me. I can happily nerd out about qubits and qudits, about how
these strange quantum states can represent and process information in ways that
make classical bits look quaint. I took AP Physics in high school, and although
I was a political science major at NYU, I have always been
fascinated by quantum mechanics.
The upside is huge: breakthroughs in medicine, optimization,
materials science, and more. But I can also see the legal and governance
nightmare waiting in the wings. How do you explain superposition and
entanglement to a judge in a way that matters for causation, reliability, or
admissibility? How do you talk about decoherence and still claim you have a
stable, reproducible process behind a key piece of evidence?
And then there is the risk that quantum computing blows up
our existing security assumptions. The encryption that underpins global
finance, government communications, and everyday privacy is not guaranteed to
survive contact with sufficiently powerful quantum machines. That is why
post-quantum encryption and “quantum governance” are not just buzzwords to me;
they are the early lines of defense in a world where attackers can “harvest
now, decrypt later.” The upside of quantum is enormous, but so is the downside;
it has the same dual-use character as generative AI, only with potentially
deeper structural consequences.
When you pile all of this together (cybersecurity, data
protection, data transfers, data privacy, records management, internal
investigations, computer forensics, AI governance, quantum governance, social
media governance, body and drone cameras, self-driving vehicles, AI robotics,
multilingual data, hyperlinked file structures), the pattern becomes clear. My
work is about living in the middle of that complexity and trying to keep the
whole system from spiraling into chaos. Every project is a balancing act. Collect
too much data and you increase cost, privacy exposure, and breach risk; collect
too little and you are facing sanctions or an incomplete factual record. Redact
too aggressively and you look evasive; redact too lightly and you leak
something you can never take back.
So when I say I love and hate technology, what I really mean
is that I love what it lets us do and I hate how easy it is to get hurt by it.
I love the creativity, the speed, the power, and the sheer intellectual puzzle
of it all. I hate the fact that every new solution seems to generate a new
class of challenges, that every innovation comes with its own fine print of new risks
that legal, compliance, and governance professionals have to parse and manage.
Technology is both my toolset and my adversary, my career and my cautionary
tale. Risks related to the use of technology have cost me most of my hair, and
what is left is rapidly turning gray. I am not complaining; I have chosen to
stand right at that fault line, trying to make sense of it for clients, for
courts, and, frankly, for myself. I certainly wish my late cousin were still
here with us to help me grapple with all of these technology-related risks.
May my cousin Anthony Butera (1966–1986) rest in peace, and
thank you for reading this narrative.