Here in the USA, you have to be afraid for your job these days. Layoffs are rampant due to outsourcing, and now we have AI on the horizon promising to make things more efficient, but we all know what it's really going to be used for: they want to automate away everything. People packaging goods for shipping; white-collar jobs like analytics, business intelligence, customer service, chat support. Any job that takes a low or moderate amount of effort or intellectual ability is threatened by AI. But once AI takes all these jobs away and shrinks the amount of labor required, what are all these people going to do for work? It's not like you can easily retrain a business intelligence engineer to do something else like HVAC or nursing. So you have the entire tech industry basically folding in on itself, trying to win the rat race for the few remaining jobs left over…
But it should be pretty obvious that you can't run an entire society with no jobs. People can't buy groceries, groceries don't sell, so grocery stores start hurting and can't afford to employ cashiers and stockers, and the whole thing starts crumbling. This is the future of AI, basically. The more we automate, the less there is for people to do, so they have no jobs, no income, and no way to survive…
Like, how long until we realize how detrimental AI is to society? 10 years? 15?
For now, I work in AI.
IMO, using AI to remove jobs is the business equivalent of the Darwin Award. No sane executive will look at AI and see job replacement. A dumb executive will look at AI and see more productivity gains. A smart executive will see AI as a way to improve tooling for workers that explicitly want to use AI.
Sadly, as with most tech improvements, we'll see lots of companies run by stupid people try to do stupid things with it. The best we can hope for is that people have opportunities to bail and find better jobs when their employer says "let's fire HR and replace it with GPT", only to get absolutely brutalized by legal fees when their AI HR decides to fire someone for a protected reason, or refuses to fire a thief because they have a disability, or hits a situation that requires human judgment that simply isn't there, or one of the hundreds of other ways it could go hilariously wrong.
It happens all the time. I remember watching solid profitable tech companies pivoting to delivering large apps on the new iPhone app store because “it’s the future”, only to realise that spending two years to develop an office suite for the iPhone 4 was a fucking stupid idea in hindsight. I remember people firing web developers because WYSIWYG editors would mean that you could design and build a website in the same way you create a Word doc. Stupid execs will always do stupid shit, and the world will move on.
Yup.
Some guys I know who worked at a developer contracting house (that I briefly worked for as well) all lost their jobs over the course of a year or so, as the company started rapidly downsizing because “Copilot means we don’t need as many developers anymore, we can fill orders with a skeleton crew.”
I’m excited to see that company fail for their bullshit.
People had the same fears about cars, the internet, computers, telephones, the printing press, and even just books and reading/writing.
It's capitalism. Capitalism will replace you with a machine. AI is just a tool.
What do you think happened to buildings full of engineers drafting plans and making stress-load calculations? What do you think happened to switchboard operators?
How long until it's not making the right people money anymore? Sometime after that.
Society can exist without jobs; not everything has to revolve around capital. In fact, reaching a post-scarcity world is a prerequisite for communism.
AI hype is also overblown as fuck. I remember watching the CGP Grey video Humans Need Not Apply, what, 8 years ago? We haven't really achieved some epic breakthrough since then, have we?
From my perspective as a software engineer, "AI" is nothing but a productivity tool. It reduces the amount of mundane work I have to do, but then so does the IDE I use.
As humans, we have been automating tasks for a long time. Just think about your washing machine: do you have any idea how hard it would be to have clean clothes without one? Do you think we would be better off if we needed cleaning services that washed our clothes using human labour, just so people have jobs? Or is it better to use that effort elsewhere?
This is the part of the AI conversation that always bugs me. People have just concluded that the hype is real and we’ve reached the point that people fear in movies. They don’t understand that it’s mostly bullshit. Sure, the fancy autocomplete can toss up some boilerplate code and it’s mostly ok. Sure, it saves me time scrolling through StackOverflow search results.
But it's simply not this all-knowing miracle replacement for everything. I think everyone has been conditioned by entertainment to fear the worst. When that bubble bursts, THAT will be what wreaks havoc on the economy.
“AI” returns mathematically plausible results from its tokenized training data. That is the ONLY thing it does. It doesn’t consider, it doesn’t fact check itself. “AI” in its current state is a party trick.
Just to be clear, I’m not saying it isn’t useful, I’m just tired of hearing people say that the code “thinks”.
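For what it's worth, the "most likely next word" mechanic being described can be illustrated with a toy bigram model. This is a deliberately simplified sketch with made-up function names, not how a real LLM works internally (real models use learned embeddings and transformers), but it shows the core idea of echoing training-data statistics rather than "thinking":

```python
from collections import Counter, defaultdict

def train_bigram(text):
    # Count, for each word, how often each other word follows it.
    words = text.split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(model, word):
    # Return the statistically most frequent next word, or None if unseen.
    followers = model.get(word)
    return followers.most_common(1)[0][0] if followers else None

model = train_bigram("the cat sat on the mat the cat ate the fish")
print(predict_next(model, "the"))  # "cat" (followed "the" twice; "mat"/"fish" once)
print(predict_next(model, "sat"))  # "on"
```

The model never "considers" anything: it just regurgitates the most common continuation from its training text, which is the party-trick behavior the comment is pointing at (scaled up enormously in real systems).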
No matter what, it helps me incredibly.
They're starting to add options to cite references and consult documentation, and some of the engines actually check the code they generate to make sure it's viable.
Now that they’ve hit stumbling blocks on organically improving, all those things you’re talking about can be done with conventional techniques.
Your last sentence diminishes the value of the first sentence. These LLMs save me a ton of time and massively increase my productivity.
It’s saving me a hell of a lot of man hours on incredibly tedious tasks that would require looking up individual items in a wiki or the like and then directly populating the answers into a spreadsheet… Our team doesn’t have the budget to hire someone to do it, so it basically just wouldn’t get done without it.
Useful party trick for me!
I can answer that. We won’t.
We'll keep iterating and redesigning until we have an actual working general intelligence. Once we've created a general intelligence, it will be a matter of months or years before it's a superintelligence that far outshines human capabilities. Then you have a whole new set of dilemmas. We'll struggle with those ethical and social dilemmas for some amount of time, until the situation flips and the real ethical dilemmas are shouldered by the AIs: how long do we keep these humans around? Do we let them continue to go to war with each other? Do they own this planet? Etc.
Assuming we can get AGI. So far there's been little proof we're any closer to getting an AI that can actually apply logic to problems that aren't popular enough to be spelled out a dozen times in its training dataset. Ya know, the whole perfect scores on well-known and respected college tests, but failing to solve slightly altered riddles for children? Being literally incapable of learning new concepts is a pretty major pitfall if you ask me.
I'm really sick and tired of this "we just gotta make a machine that can learn and then we can teach it anything" line. It's nothing new; people have been saying this shit since fucking 1950, when Alan Turing wrote it in a paper. A machine looking at an unholy amount of text and evaluating, based on a new prompt, what the most likely next word is, IS NOT LEARNING!!! I was sick of this dilemma before LLMs were a thing, but now it's just mind-numbing.
AI developers are like the modern version of alchemists. If they can just turn this lead into gold, this one simple task, they’ll be rich and powerful for the rest of their lives!
Transmutation isn’t really possible, not the way they were trying to do it. Perhaps AI isn’t possible the way we’re trying to do it now, but I doubt that will stop many people from trying. And I do expect that it will be possible somehow, we’ll likely get there someday, just not soon.
The more we automate, the less there is for people to do, so they have no jobs, no income, and no way to survive…
Most solutions to this issue usually involve some variant of a universal basic income. However, that gets politically boiled down to “MOAR TAXES GOVERNMENT IS STIFLING THIS COUNTRY!1!1”, so in countries like the US that want to keep the freedom of being able to be homeless and starving, it’s not going to be possible.
Depends on your definition of “we”…
Is AI really only detrimental to society? We're in the initial stages, where companies promise the world to get investors' attention. But once the investors realize what it's actually capable of, the focus will have to shift to what it can actually do.
I think sometime next year we’ll have a crash, and all the companies pushing AI will be forced to either focus on quality, or find the next thing to push.
Not AI TVs with a Bluetooth connection to your phone! Those will be totally fine! Go ahead and say things about Trump, then go to Amazon and search for the grass trimmer you love. Go ahead and talk about the truck you like or the computer RAM you need. So you work for Costco? Hmmm, tell us more. Are you at the executive level? You wouldn't be a purchaser, maybe 🤔? Or what?
Distracted by media and a market of commodities
We’re just resources, units for their economy
And they want technology that’ll make us obsolete
I mean why pay for workers when you can automate machines
yeah we’re being ruled by other human beings
who seem to have forgotten what that means
we’re hamsters on a wheel we’re a human fucking farm
And they've worked us to the bone we're all weathered and worn
https://ludlowpdx.bandcamp.com/track/times-new-roman
Every Empire on this Earth has fallen…and Floating very much is not flying. 🎵
And so as not to leave you with a Gordian Knot:
"The structures of our state economies are going to matter in terms of protecting democracies. By that I mean: if you look at economies that were based on small producers, like New England was, vs. states like the South and the American West that were always built on the idea of very high capital using extractive methods to get resources out of the land (cotton, mining, oil, water, or agribusiness), those economies always depend on a few people with a lot of money and a whole bunch of people who are poor and doing the work for those rich guys. And I'm not sure that is compatible with good governance without addressing the reality that if people have more of a foothold in their own communities, they are then more likely to support the kinds of legislation that support community [Education, Healthcare, …], and that may be the future of democracy, if not a national democracy."
^ https://youtu.be/D7cKOaBdFWo?t=2139 Heather Cox Richardson, professor of American history
"Practicing mutual aid is the surest means for giving each other and to all the greatest safety, the best guarantee of existence and progress, bodily, intellectual, and moral."
Mutual Aid By Pëtr Kropotkin https://thereitis.org/kropotkins-mutual-aid/ https://theanarchistlibrary.org/library/petr-kropotkin-mutual-aid-a-factor-of-evolution https://theanarchistlibrary.org/library/anarcho-mutual-aid-an-introduction-and-evaluation
Solidarity Economies, and Mutualism, will be the way forward.
To follow Corporate and their bought-out state institutions is to walk willingly into one's own ruin.
Most importantly “AI” doesn’t exist.
But it's also worth noting that absolutism is almost never helpful. I don't think data, statistics, computers, etc. are inherently evil technologies. It's the usual problem of how capitalism directs research and development towards violent control instead of liberation.
General Artificial Intelligence doesn’t exist - we don’t have HAL9000 or Terminator or Cortana yet.
But up to that point, and almost certainly even past it, the AI effect means the more sophisticated AI things become, the more people think "well, [insert AI thing] isn't actually intelligent or an AI".
As Larry Tesler says: “AI is whatever hasn’t been done yet.”
Also, this recent classic: I will fucking piledrive you if you mention AI again was really illuminating.
I don’t get the point of the comic, what happens to the money part?
That’s already been going to the wrong people for decades now.
The least drastic solution would be something like UBI, where a lot of people would be miserable but at least able to put food on the table. (If you've seen The Expanse, I imagine something like the scene where Bobbie asks for directions on Earth.)
A more drastic solution would be to not tie the worth of people to the amount of work they do or the amount of wealth they have.
I don't disagree with most of this. But I think the celebration of not having a job muddles the point a bit. I don't see a viable future if everyone does the same.
I see your point; but not even 200 years ago, people couldn't imagine most people working in "industries" other than agriculture.
Historically, most people worked in agriculture. (I'm not sure of the exact percentage; IIRC it was over 80%, but we can take a low estimate of 50%.)
Nowadays less than 5% of the world population works in agriculture, due to increases in automation (machinery that can plow and harvest), and better understanding of the process (more efficient use of land).
While some of that turned out to be bad for the environment (who knew biodiversity is good, actually?), it did free up most of the population to do other things.
I hope it’s not “AI” that will automate the future (because of the huge energy costs to the environment), but automation more generally could help us free more time for passionate pursuits.
Jobs like software engineer didn’t even exist a century ago, and who knows what kind of new jobs will be created in the next 100?
I’m also an engineer and I read this as… agriculture is a house of cards, any fuckup and we’re all dead. Thanks AI and automation!
At some point society will need to realize that traditional work that is handled by automation (whether AI or not) isn’t necessary and economic systems will have to change.
I’m not an expert by any means, and I just don’t see this happening in the near-term. My opinion is that for now (the short-term at least) it’ll just widen the gap between rich and poor.
Yeah, industrialization didn’t end the world and complete automation won’t either unless we decide to roll over and die instead of changing things so people benefit from the automation instead of suffering because of it.
Automation should be a good thing. If we can have things that need to happen be done more efficiently with less work we absolutely should. But we should distribute the results of those efficiency gains fairly, which is where the current system fails.