Countdown to Sixty – Part 2: The Hardest Obstacle to Decent Security? Human Stupidity.

🎂🧠🎸 Countdown to Sixty – Part 2 is live 🧯🛰️

Birthday countdown update: T-minus a few days until I officially qualify as vintage hardware.

I need to publish this before age hits the Delete key on my short-term memory.

Today’s topic?

The single toughest obstacle to decent security: human stupidity—not “not knowing”, but not knowing while loudly pretending you do.

In forty years I’ve watched Web 2.0 deliver a cultural revolution (not a technical one): everyone became a publisher, the algorithm handed out megaphones, and confident error went viral. Add developers racing deadlines (hello COM+ era turning off security to ship faster, and the .NET garbage collector presented as if it hadn’t existed since dinosaurs wrote Lisp), and you get a perfect storm.

We’ll wander through real revolutions vs rebrands (XDR/SASE/AI-wash, pick your flavour), budgets that migrate like birds (huge after a breach, extinct by Q2), and the unfashionable truths that actually work: tested backups, email continuity, least privilege, and knowing where the data is. There’s even a cameo from policy theatre—because, as I keep saying, DNS outruns decrees (ciao, PiracyShield).

Language inspirations remain unapologetically British: a wink from Jerome K. Jerome, the absurd logic of Douglas Adams, and a tiny Monty Python coconut to bonk nonsense on the head.

If Part 1 made you smile, Part 2 will make you nod, wince, and possibly check your backups.

Antonio Ieranò

Security, Data Protection, Privacy. Comments are my own sole responsibility 🙂

August 22, 2025

Okay, Part 2 means three more rants until my birthday. Then, I promise, I will turn normal again. Well, as normal as I can be.

Language inspirations (blame them, not my inbox): a wink from Jerome K. Jerome for the gently exasperated eyebrow, a generous splash of Douglas Adams for logical absurdities that turn out to be true, and a small Monty Python coconut to bonk nonsense on the head. If Part 1 made you smile (or made a PR department reach for a sedative), perfect. We pick up right where we left off.

Prologue

Work is a big part of our lives, and at the edge of my 60th year I could hardly avoid reflecting on my working life, could I? So I will go straight to the point: what is the first thing I think of when I look back on my working experience? The answer is “stupidity.”

Let’s see what, and where.

What is stupidity, really?

Let’s not be delicate. Stupidity isn’t “not knowing.” We’re all allowed not to know. Stupidity is not knowing while failing to realise you don’t know, and then pretending you do—decisions made on blind faith, fashion, the last vendor demo, or the loudest person in the room. I’ve worked in this business long enough to watch digitalisation soak into every pore of work and life and, with it, a sort of industrialised confusion. Tools improved; understanding often didn’t. Budgets behave like migratory birds: dramatic flocks appear just after an incident; a month later the board’s goldfish memory resets, a scapegoat is found, and the flock flies south. Cue the classic line: “It was an unexpected, incredibly sophisticated attack.” Which usually translates to: “We didn’t spend a penny on working backup and restore, but look, lasers!”

This is a discursive stroll—forty years of information technology and security, and the recurring villain wearing a human face. Names omitted to protect the flagrantly guilty (and my comment section).


Four decades of kit: from beige boxes to hyperscale (and back to very small things)

In the late 1980s and early 1990s, personal computers escaped the lab and multiplied like rabbits. Ethernet won its bar fights; “backup” meant shoving a tape into a hungry drive and praying with the sincerity of a saint who has seen things. I once taught an IBM course on migrating from OS/2 to Windows (c. 1993–1996 for the OS/2→Windows tide). One gentleman—two days from retirement—slept through the entire class with the serene confidence of a Zen monk. His snoring punctuated my slides at perfect intervals, a living metaphor for the human capacity to nap through revolutions.

By 1995–2005, x86 had eaten the data centre; Wi-Fi arrived; mobiles began colonising our attention one chirp at a time. From 2005–2015, we turned servers into Russian dolls: virtualisation on pizza boxes, SSDs making patience obsolete, laptops outnumbering desktops like caffeinated squirrels. Then, 2015–2025: we put the data centre on stilts and called it a cloud; we containerised everything (and occasionally forgot what was inside the boxes); ARM turned up in places we didn’t expect; and the “perimeter” put on a backpack and went travelling.

Each decade promised the hardware would save us. It didn’t. It merely saved time—time we immediately used to click on worse things, faster.


Software grew up, learned to talk back, and then learned to shout

We left the mainframe pews for client/server in the late 1980s–mid-1990s and discovered patching as a spiritual discipline. The mid-1990s–early 2000s gave us web applications—CGI, ASP, JSP—where HTML pages began to act like software while input validation remained a myth told to frighten junior developers.

Then came Web 2.0 (roughly 2004–2012), which—let’s be honest—was not a technological revolution so much as a cultural one. The technology was still HTML, HTTP and friends; the revolution was that users became publishers. First blogs (think 1999–2003: the golden age of Blogger/early WordPress), then social networks (2003–2008: Friendster/MySpace/Facebook/Twitter et al.) turned up with a mirror and said: “Darling, the world needs to hear your thoughts now, hourly.” Participation was the upside; the algorithmic megaphone was the downside. We built a system where outrage outperforms nuance and certainty grows in direct proportion to ignorance. The need to show up swamped the need to know anything. It nourished the clueless conspiracy nonsense we see today and fed the radicalisation of positions—including in tech. We now have tribes arguing about architectures as if they were football teams. The fool does not know they believe; the fool simply believes louder.

From 2008–2016, app stores replaced the desktop and your permissions became a confession booth. From 2006 to today, SaaS quietly turned your business model into a direct debit with cheerful dashboards. The 2010s gave us DevOps (walls down, speed up) and DevSecOps (staple security into the pipeline). Promises to “shift left” often took a right turn into the backlog. Meanwhile AI/ML—old maths on new GPUs—oscillated between life-saving pattern detection and numerology with better charts.

Our collective amnesia did the rest. We declared each cycle novel and thereby kept shipping old bugs with new stickers.


Developers, deadlines and the security that “will be added later”

This part is awkward because I like developers. I was one. But I’ve also watched, for decades, how shipping code on time routinely won over shipping code safely.

Consider COM/DCOM in the mid-1990s and COM+ around 1999–2000 (Windows 2000 era). Role-based security, declarative transactions, even access checks—existed. And yet, time and again, projects “temporarily” disabled or bypassed those controls for speed. “We’ll harden later.” Later rarely arrived. We all have scars from that period.

Then .NET was announced circa 2000, first framework release 2002. I still remember people applauding the garbage collector as if it were a brand-new miracle. It made me smile. I’m old enough to know what a garbage collector is and to remember its lineage: Lisp had GC in the late 1950s/early 1960s, Smalltalk in the 1970s, Java shipped with GC in 1995. .NET’s GC was welcome—but not an innovation. This is the pattern: security features and sound engineering practices arrive, and then schedules arrive faster; the first thing to go is the thing no one sees in the demo. The productivity theatre wins; the attack surface applauds.

If you want the one-sentence summary of this chapter: we kept building faster and calling it better, while moving “security” to the post-credits scene.


Security services: the never-ending remix (with better fonts)

If you’ve lived the whole arc, you’ve seen this film. Antivirus began in the late 1980s with signatures, grew into EPP in the late 2000s, blossomed into EDR in the early-to-mid 2010s, and then became XDR in the late 2010s, adding integrations, context and confetti. Behavioural analysis and heuristics? Present since the 1990s—long before they were marketed as “AI.”

Network eyes moved from IDS (mid-1990s) to IPS (early 2000s) and now parade as NDR (late 2010s→today). Same curiosity about packets; better lighting. Firewalls, born in the early 1990s as packet/stateful devices, were baptised NGFW in the late 2000s; they didn’t ascend to heaven, they learned to speak “application.” I had my own cameo: at Symantec I became Product Manager for a firewall line; three days later it was retired. Not my fault. The shelf life of my product was shorter than yoghurt.

On the web front, we layered proxies and secure web gateways in the early 2000s, discovered CASB in the mid-2010s, then rolled it all into SSE/SASE by the late 2010s—a pragmatic admission that your perimeter commutes between coffee shops. Email security matured from late-1990s/2000s gateways to 2010s advanced detection… and then the penny dropped: continuity. The late 2010s–2020s taught us that clouds faint; you’d better have oxygen. I repeat this often in my writing because reality keeps agreeing with me: the inbox is the company’s aorta. When the platform naps, Emergency Inbox keeps patients alive; when it doesn’t exist, executives rediscover the ancient art of announcing outages by smoke signal.

In the cloud terrarium, the late 2010s gave us CSPM/CWPP/CIEM; the early 2020s introduced CNAPP (umbrella) and DSPM/DSPA to finally shine a light on data sprawl. The acronyms changed; the need to know what you have and who can touch it did not. And yes, “Next-Gen” often just means the 2009 feature has finally been turned on.


The alphabet that bites back: laws and standards that help—and those that perform

Governance wasn’t invented by webinars. BS 7799 in the mid-1990s became ISO/IEC 27001/27002 in 2005, refreshed 2013 and 2022. PCI DSS has been reminding merchants to have a relationship with reality since 2004. America, with HIPAA (1996), GLBA (1999) and SOX (2002), discovered regulation as a mood. The NIST CSF arrived 2014 (v1.0), clarified 2018 (v1.1), and became actually readable by normal humans in 2024 (v2.0).

Then the EU delivered a bouquet: GDPR (adopted 2016, enforced 2018) so data is no longer a souvenir; NIS (2016/2018) and NIS2 (2022/2024 transposition) so operators stop pretending; DORA (2022/2025 applicability) so finance learns to spell resilience; the AI Act (2024) to fence the model zoo; and CRA plus eIDAS 2.0 (2024–) to remind products and trust services that grown-ups exist.

And then there is the theatre. PiracyShield (2023–2024) is a cautionary tale: when policy cosplays as security, DNS outruns decrees, rights faceplant, and engineers develop migraines. If you’ve read my pieces, you know the refrain: writing “security” on a hat doesn’t make the hat secure.

Every five years we “rediscover governance” like it’s a new comet. It isn’t. It’s the same comet, only the press release changed.


The social megaphone and the age of confident error

Let me underline it: Web 2.0 was a cultural revolution, not a technical one. The hardware and protocols were essentially familiar; the novelty was who got to publish. First blogs, then social, then everyone’s dopamine. The need to show up eclipsed the need to have anything useful to say. We created a world in which conspiracy flourishes precisely because it is confidently expressed. In technology, this bred absolutism: microservices vs monoliths, on-prem vs cloud, “Zero Trust or bust,” all delivered with the zeal of a sect. Meanwhile, the dull things that actually prevent disasters—patching, least privilege, tested backups—receive the sort of attention usually reserved for tax forms.


Revolutions that were real—and revolutions that were rebranded

A little honesty goes a long way. Zero Trust is a healthy narrative, but its bones are vintage: least privilege (1970s), strong identity, segmentation. XDR is mostly EDR with integrations, choreography and a better slide deck. SASE/SSE is SD-WAN plus cloud controls, a rational consolidation wrapped in brochure poetry. AI in security sometimes saves the day and sometimes is a washing label. DSPM and CNAPP finally put names to data and cloud posture we were already flailing at.

There were genuine shifts. Public-key crypto at scale and TLS-everywhere changed the street. Virtualisation and containers created utilisation economics and agility we couldn’t dream of. Smartphones dragged the threat model into your pocket. Identity as the perimeter is the most inconvenient truth of the last decade. And telemetry + automation, in the hands of adults, shorten incidents instead of lengthening status meetings.


The prime adversary remains the same: people

We share passwords like sweets, leave default credentials as if they were heirlooms, and promise patching “right after quarter-end,” by which time the exploit has retired to the seaside. Phishing remains the art of persuading a perfectly nice person to press the bright red button labelled DO NOT CLICK. Shadow IT is how your most enthusiastic colleague installs a free VPN called ShadyBear (five stars on TikTok). MFA fatigue becomes a tragic rhythm: approve, approve, approve… oh. Backups exist that never restore, which is performance art, not resilience. Some organisations laminate policies and call it security. Others deny outages by emailing about them—when the email system is down.

I’ve worked with minds so sharp they could subnet blindfolded—and with managers who sincerely thought IP stood for Instant Pizza. Both got promoted. The universe maintains balance with a wicked sense of humour.


The “new wave will save us” delusion

Every cycle promises redemption. Web 2.0 would fix knowledge; it amplified noise. Cloud would make disaster recovery obsolete; it made continuity non-negotiable. AI would end phishing; it now authors phishing in your house style. Regulation would solve security; it helps until it becomes theatre. Zero Trust would end breaches; it reduces blast radius if someone actually reads the playbook.

And yes, the boring miracle remains: working, tested backups. They are cheaper than regrets and, unlike press releases, restore data.

A travel metaphor to illustrate mindset: during a period when I was airborne about eighty percent of the time, one employer reimbursed laundry worldwide—except underwear. Try explaining that to your suitcase. Many security programmes feel similar: frameworks that cover everything except the obvious thing you’ll need on day one.


Budgets and the blame carousel

The budget arc plays out like farce. Quiet quarter? “Why so much spend on security—trim by thirty percent.” Incident? Sirens, headlines, late-night pizza, heroic posts. Suddenly budgets monsoon in; tools multiply; training does not. Three months later, we’re told it was an “unexpected, incredibly sophisticated attack,” and the CFO wonders aloud why we’re paying for email continuity when the provider’s SLA says 99.9%. Someone junior is volunteered as a learning opportunity. Budgets return to diet mode.

If you remember one sentence, take this: culture eats budget. If leadership doesn’t understand what it’s buying, spend will swing wildly and still miss the target.


Hardware ghosts, software oops, and services that actually help

Underneath the shiny, the ghosts remain. Firmware and supply chains live where scanners fear to tread. Wi-Fi still turns offices into hospitable surfaces for strangers if it’s misconfigured—and it often is. IoT and BYOD mean everything computes and nothing updates. Air-gapping exists as a lifestyle; occasionally it helps, often it performs.

In software, we no longer write so much as assemble strangers’ libraries and call it a day. SBOMs are our attempt to remember who we invited to dinner. Secrets roam public repos like tourists without maps. CI/CD ships fixes—and mistakes—at speed. Data multiplies like tribbles, then someone asks: where is the PII? At which point DLP, DSPM and their cousins appear, late but necessary.
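Since I mentioned secrets roaming public repos like tourists without maps: the naive version of that check fits on one page. Here is a minimal Python sketch of the idea, flagging obvious key patterns and suspiciously random-looking tokens. The patterns and the entropy threshold are illustrative assumptions on my part, not a vetted ruleset; in real life you run a dedicated scanner, and you run it before the commit, not after the breach.

```python
import math
import re
from pathlib import Path

# Illustrative patterns only; real scanners ship vetted, much larger rulesets.
PATTERNS = {
    "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def shannon_entropy(s: str) -> float:
    """Bits per character: random tokens score high, prose scores low."""
    if not s:
        return 0.0
    freqs = (s.count(c) / len(s) for c in set(s))
    return -sum(p * math.log2(p) for p in freqs)

def scan(repo_root: str, entropy_threshold: float = 4.5) -> None:
    """Walk a working tree and print anything that smells like a secret."""
    for path in Path(repo_root).rglob("*"):
        if not path.is_file() or ".git" in path.parts:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for name, pattern in PATTERNS.items():
            for match in pattern.finditer(text):
                print(f"{path}: {name}: {match.group(0)[:12]}...")
        # Long unbroken tokens that look random are possible keys or tokens.
        for token in re.findall(r"[A-Za-z0-9+/=_\-]{24,}", text):
            if shannon_entropy(token) > entropy_threshold:
                print(f"{path}: high-entropy token: {token[:12]}...")

if __name__ == "__main__":
    scan(".")
```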

Services work when we do. Email security plus continuity is the difference between a company that breathes during outages and one that communicates via smoke signals. EDR/XDR/NDR are powerful, but telemetry without context is expensive trivia. Identity-first security—IdP, MFA, risk-based auth—succeeds exactly as far as people resist sharing admin credentials. Takedown and brand protection remain the practical answer to malicious domains: don’t litigate a fire; put it out, then sue. As I like to repeat: DNS outruns lawyers every day that ends in “y.”


The economics of stupidity

Security is applied economics with better acronyms. At the micro level, one bad click ruins everyone’s weekend. At the meso level, shared suppliers mean shared blast radius. At the macro level, regulation should seek resilience, not headlines. One way or another, you pay. You either pay for controls, or you pay for consequences. Only one of those invoices arrives on your schedule.
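If you want the arithmetic behind that last sentence, the classic back-of-the-envelope model is annualised loss expectancy: ALE = single loss expectancy × annual rate of occurrence. A tiny sketch, with every figure invented purely for illustration (your incidents will helpfully supply real ones):

```python
# Back-of-the-envelope annualised loss expectancy: ALE = SLE * ARO.
# Every figure here is invented for illustration, not a benchmark.

single_loss_expectancy = 400_000   # assumed cost of one bad incident (EUR)
annual_rate_of_occurrence = 0.25   # assumed: one incident every four years
control_cost_per_year = 60_000     # assumed: tested backups + email continuity
residual_aro_with_controls = 0.05  # assumed rate once the dull controls exist

ale_without = single_loss_expectancy * annual_rate_of_occurrence   # 100,000
ale_with = single_loss_expectancy * residual_aro_with_controls     # 20,000
net_benefit = ale_without - ale_with - control_cost_per_year       # 20,000

print(f"ALE without controls: {ale_without:,.0f}")
print(f"ALE with controls:    {ale_with:,.0f}")
print(f"Net annual benefit:   {net_benefit:,.0f}")
```

Even with deliberately boring numbers, the dull controls pay for themselves; the only invoice you control is the first one.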


What actually works (and why it’s unfashionable)

The effective list remains gloriously dull: least privilege and segmentation; patch management that happens before ransomware, not after; email continuity and an off-path communication channel; backups that restore because someone checked; actual asset and data inventories; runbooks so incidents aren’t improv comedy; a takedown muscle that contains first and litigates later. These things are rarely glamorous. They also keep your weekends.
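“Backups that restore because someone checked” deserves one concrete illustration. Below is a minimal sketch of an automated restore drill in Python, assuming plain tar archives and a SHA-256 manifest; the file names and manifest format are hypothetical, so wire the equivalent into whatever your backup tooling actually produces. The point is the shape: restore to a scratch directory, verify every hash, fail loudly, on a schedule.

```python
import hashlib
import subprocess
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Stream a file through SHA-256 so large restores don't eat RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def restore_drill(archive: Path, manifest: Path) -> bool:
    """Unpack the backup into a scratch directory and verify every file.

    Assumed manifest format, one file per line: "<sha256> <relative/path>".
    """
    with tempfile.TemporaryDirectory() as scratch:
        # The "restore" step here is plain tar; substitute your real tooling.
        subprocess.run(["tar", "-xf", str(archive), "-C", scratch], check=True)
        ok = True
        for line in manifest.read_text().splitlines():
            if not line.strip():
                continue
            expected, rel = line.split(maxsplit=1)
            restored = Path(scratch, rel.strip())
            if not restored.is_file() or sha256(restored) != expected:
                print(f"FAIL: {rel.strip()}")
                ok = False
        return ok

if __name__ == "__main__":
    # Hypothetical paths; point these at your most recent real backup.
    passed = restore_drill(Path("backup-latest.tar"), Path("backup-latest.sha256"))
    print("Restore drill:", "PASS" if passed else "FAIL")
```

A drill that runs monthly and pages someone on FAIL turns “we have backups” into “we have restores”, which is the only tense that matters during an incident.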


Serious, never solemn

If Part 1 set the tone—serious but not solemn, smiles over yawns—Part 2 names the villain. Technology advanced. Understanding lagged. Stupidity scaled—and Web 2.0 handed it a megaphone. Developers raced; security jogged behind carrying the water bottles. The fool doesn’t know they believe; the fool simply believes louder. Your job, if you lead, is to replace belief with understanding before the incident replaces your weekend.

In Part 3, I’ll talk about people—the geniuses, the charlatans, and the ones who sincerely thought IP meant Instant Pizza. I’ve met all three. Sometimes on the same call.

