RFC: 15 GCP Engineers vs. a $425M Salesforce Monolith (Nevada DMV Rescue)

I am looking for straight-up feedback. I see a serious gap in Las Vegas: a lot of people in charge are asleep at the wheel, and I want to know whether I’m dreaming in technicolor or whether I should simply move on. Thanks in advance.

The Context: I am a resident of Las Vegas with a front-row seat to a massive public-sector failure: the Nevada DMV Transformation Effort (DTE). The project has ballooned from $125M to $425M, and the projected deadline has slipped from 2026 to 2029.

The Root Cause: The incumbent integrator (130+ consultants) is attempting to force-fit Salesforce to act as a high-volume, state-level Transaction Processing System (TPS).

  • They are hitting Salesforce governor limits (the platform’s hard per-transaction caps on queries, CPU time, and DML operations).

  • The “online portal” is just a file uploader that dumps PDFs into a manual queue (creating a 6-week backlog).

  • The state is paying a “Success Tax” via per-user licensing for every resident who joins the system.

The “Tiger Team” Proposal: I have a “Solicitation Waiver” pathway (NRS 333.150) to propose a rescue pilot. My thesis is simple: High Talent Density > Body Count.

I believe a “Tiger Team” of 15 Senior GCP Engineers (Serverless/Go/Python) can outperform an army of 130 generalist consultants by stripping away the monolith and using consumption-based infrastructure.

The Proposed “Rescue Stack”:

  1. Document AI (Custom Processors + HITL) to digitize 50k monthly mail-in titles.

    • Goal: Reduce processing time from 6 weeks to <5 minutes.

    • Cost: ~$0.03/page vs $50/hour labor.

  2. Compute (The Logic Layer): Cloud Run services (Go/Python) to decouple Vehicle Registration logic from the legacy mainframe.

    • Goal: Scale to zero during lulls; handle end-of-month spikes without “maintenance windows.”

  3. Data (The Accountability Layer): BigQuery for real-time transparency.

    • Goal: Give the Legislature a dashboard they can actually read, bypassing the opaque vendor reports.
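To make item 1 concrete, here is a minimal sketch of the confidence-threshold routing that Document AI + HITL (human-in-the-loop) hinges on. Everything here is my assumption, not anything from the DTE project: the field names, the 0.9 cutoff, and the `ExtractedField` stand-in (in real code, these values would come from a Document AI custom processor’s `Document.entities` response, not be constructed by hand).

```python
from dataclasses import dataclass

# Stand-in for one entity from a Document AI custom processor response.
# The field names below are hypothetical title-form fields.
@dataclass
class ExtractedField:
    name: str
    value: str
    confidence: float  # 0.0-1.0, as reported by the processor

HITL_THRESHOLD = 0.9  # assumed cutoff; would be tuned against real error rates

def route_document(fields: list[ExtractedField]) -> str:
    """Return 'auto' if every field clears the confidence bar,
    otherwise 'hitl' to queue the scan for human review."""
    if fields and all(f.confidence >= HITL_THRESHOLD for f in fields):
        return "auto"
    return "hitl"

# Example: a clean form vs. one with a shaky handwritten VIN.
clean = [ExtractedField("vin", "1HGCM82633A004352", 0.97),
         ExtractedField("owner_name", "JANE DOE", 0.95)]
messy = [ExtractedField("vin", "1HGCM8?633A00435Z", 0.41),
         ExtractedField("owner_name", "JANE DOE", 0.95)]

print(route_document(clean))  # -> auto
print(route_document(messy))  # -> hitl
```

The point of the sketch: the “90% automation” question below is really a question about where this threshold lands on 1980s handwriting, because every document under it still costs you a human.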
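For the logic layer (item 2), the decoupling claim is easier to evaluate when the business rule is separated from the transport. A sketch, with loudly invented numbers: the fee table, vehicle classes, and county surcharges below are placeholders I made up, not Nevada’s actual schedule, and `renew_registration` would sit behind a small Cloud Run HTTP service (Flask, `net/http`, etc.) that scales to zero between requests.

```python
# Hypothetical renewal-fee logic, pulled out of the monolith so it can be
# unit-tested in isolation and deployed as one small Cloud Run service.
# All dollar amounts and classes are invented placeholders.
BASE_FEES = {"passenger": 33.00, "motorcycle": 33.00, "trailer": 27.00}
COUNTY_SURCHARGE = {"clark": 0.05, "washoe": 0.04}  # fraction of base fee

def renew_registration(vehicle_class: str, county: str) -> dict:
    """Compute the total due for one renewal, or an error for unknown input."""
    if vehicle_class not in BASE_FEES:
        return {"ok": False, "error": f"unknown vehicle class: {vehicle_class}"}
    base = BASE_FEES[vehicle_class]
    surcharge = base * COUNTY_SURCHARGE.get(county, 0.0)
    return {"ok": True, "total_due": round(base + surcharge, 2)}

print(renew_registration("passenger", "clark"))   # 33.00 * 1.05 -> 34.65
print(renew_registration("hovercraft", "clark"))  # rejected, not a 500
```

The design choice being argued for: a function this small is what “decoupled from the mainframe” actually means in practice, and it is the unit a 15-person team ships dozens of, instead of configuring one monolith.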

My Questions to the Community (and I am no one): I’m simply tired of watching my city burn money, and I need the experts to tell me where this breaks.

  1. The “Dirty Data” Reality: Has anyone used Document AI on 1980s-era handwritten government forms? Is 90% automation a fantasy?

  2. Domain Complexity: Can a 15-person team actually rebuild core DMV logic (tax codes, vehicle classes, etc.) in 18 months, or is the domain complexity too high for that headcount?

  3. If you’ve worked in GovTech, what is the specific technical roadblock that kills “modernization” projects like this? (Legacy mainframe integration? Data migration? Security compliance?)

I don’t have any resources beyond my trust in this community, for what it’s worth, so your feedback means the world to me.

Thanks,

whoever.