@afkchris
bro the AI watching my screen recordings is gonna learn that 80% of my job is alt-tabbing away from twitter
AI labs can train models from screen recordings to replicate jobs. Analysis: 53.9% supportive, 27.1% opposing — rising concern and urgency across industries.
Real-time analysis of public opinion and engagement
What the community is saying — both sides
Supporters repeatedly emphasize that making work visible makes it economically replaceable.
A large contingent insists this trend is already accelerating — “it’s here,” “coming fast,” and enterprise pilots are predicted soon; several cite RPA, Microsoft tooling, and Goldman Sachs projections as evidence that behavioral cloning from screen recordings is moving from demo to production.
People call screen-recording capture the real threat to corporate IP and worker agency, warning that employees may unintentionally train their own replacements and that companies can harvest workflows without compensation.
Others point out practical limits and engineering friction — integrating recordings with tools, auth, edge cases, and long‑running reliability are nontrivial problems that separate demos from robust production agents.
Tasks that are routine, repetitive, or UI-driven are easiest to clone, while judgment, context, creativity, and high-stakes decision-making remain harder to reproduce and thus more durable.
The refrain is adapt or be automated: advice includes reskilling, learning to build or deploy AI, focusing on cross-domain judgment, entrepreneurship, and work that is hard to render into a recording.
People ask what unions, governments, and regulation can or should do, and worry there will be no compensation for worker-generated training data unless policy changes.
Reactions are mixed between dread and opportunism — some celebrate massive productivity gains and new product opportunities, while others fret about demand collapse if consumers become jobless.
Popular quips about alt-tabbing, fake work, and robots replacing mundane office rituals underscore anxiety but also defuse it with gallows humor.
Work is reframed as transferable processes rather than unique skill, so the competitive edge will go to those who design how AI amplifies human judgment rather than those who merely execute routine tasks.
A loud chorus of skepticism argues that watching screens can’t capture intuition, context, or the unseen steps that make many jobs work—readers repeatedly note that decision processes, off‑screen thinking, and relationship work won’t be learned from recordings.
Messy data, daily novelty, edge cases, and changing conditions mean agents trained on recordings would fail in unpredictable situations.
Commenters cite liability and reliability concerns, the need for human checks, and domain failures (math errors, debugging, test proctoring) as barriers to full automation.
Several replies point to domain specificity — emergency care, sales, mental‑health leadership, skilled trades and client relationships — as examples where human judgment and EQ are essential.
Multiple replies challenge proponents to show working demos rather than hypotheticals.
Some say AI can boost repetitive tasks but cannot wholesale replace complex jobs, which still require human oversight.
Tone ranges from jokey relief about personal job security to blunt dismissal of the idea as hype or grift; there’s also resentment from practitioners who feel the proposal misunderstands how real work happens.
Technical critics flag model and compute limits — LLM shortcomings, noisy datasets, and the need for specialized acceleration — arguing these make the leap from demo to deployment far from trivial.
Most popular replies, ranked by engagement
bro the AI watching my screen recordings is gonna learn that 80% of my job is alt-tabbing away from twitter
this assumes most jobs are learnable from watching. half of what makes someone good is the stuff that doesn't show on screen
We’ll see about that. The idea that these models are going to direct themselves seems quite pie in the sky to me.
They've been doing this for years. It's called RPA (Robotic Process Automation). The difference is now the model can handle the edge cases where the UI changed slightly. The fragility is gone.
…in jobs & human activities & AI fucks up. It won't be replacing humans ever. It will just be another tool. Contributing to the community with your labor is one of our primary functions & brings us great joy. What the fuck else are we going to do? Like, why exist at all?
once they start mapping our mouse movements and idle time it will learn all the shortcuts we use to look busy too.