Inside almost every arcade cabinet is a Dell Optiplex running Windows 7, or 10 if it's really recent. There's no such thing as an arcade board anymore; they're all Dells, or sometimes those HP mini PCs, usually with the protective plastic still on.
Daytona even uses a Raspberry Pi to control the second screen. SEGA intentionally ships those with no-brand SD cards that consistently fail after 3 months. It’s in their agreement that you’ll buy another card from them instead of just flashing the image onto an SD card that won’t break.
The Mario Kart arcade cabinet uses a webcam called the “Nam-Cam” that is mounted in a chamber with no ventilation, which causes it to overheat and die every few months, so of course you’ll have to replace those too. The game will refuse to boot without a working camera.
Oh yeah also all arcade games with prizes are rigged. All of them. We literally have a setting that determines how often the game will allow wins.
It’s in their agreement that you’ll buy another card from them instead of just flashing the image onto an SD card that won’t break.
Sounds like it’d be pretty simple to just replace it and not tell them. If they tell you they know it should’ve broken down by now, just ask, “Why, did you intentionally sell me something defective?”
Psst, that’s another secret
Oh yeah also all arcade games with prizes are rigged. All of them. We literally have a setting that determines how often the game will allow wins.
One time on vacation, my little sister and I found a crane game in the game room of our hotel that was clearly overtuned - basically every button press was another win, it was great. We still remember it fondly. A stupid thing, but even at that age we knew these are usually scams and we were stoked to just basically get cheap toys.
If I was to open up a classic video game arcade and run it entirely on downloaded ROMs, is someone coming to take me down?
Yes. You have to have a license to charge people money to play those games.
Otherwise you would have seen a ton of arcades open already
Edit: I only know this because I asked a guy who ran one. His machines were in pretty bad shape and I inquired why he didn’t just do as you thought.
You have to own the game to legally download a ROM. Can’t say other than that.
Optiplex is fucken goated stg
I worked in an arcade in the 90s. Wow have things changed. I bet pinball games are not easily fixable anymore either.
so, is there a way to duplicate one of the cards that allows unlimited plays or admin access?
The easier way is to just get an arcade emulator. Run it on your desktop PC or make your own emulation machine with a Raspberry Pi. Although you need the game images (called ROMs), which you can legally read and back up from original copies, or you can put on your pirate hat and find them on the internet, or so I've heard, allegedly.
That last paragraph sounds like something that is breaking entire sets of laws and a big lawsuit waiting to happen
What you say is unlikely, since every arcade I went to still used the same machines as in 2005.
The USA is run by unpaid 22 year old interns being supervised by underpaid 24 year olds.
Old people in charge are definitely a problem (McConnell, Feinstein etc) but the people in their offices doing all the heavy lifting are basically children.
I mean, people in their 20s have done some pretty amazing things.
Yeah but most people aren’t Alexander the Great or Mozart. And even if you are, you’re probably not working in congress, hah
Alexander had Aristotle to tutor him. If you find yourself young and in power, you better hope your elder advisors are that good.
Best I can do is Blippi sorry
(Google him if you haven’t, he’s hypnotic)
Nice
Software Engineering. Most software is basically just houses of cards, developed quickly and not maintained properly (to save money ofc). We will see some serious software collapses within our lifetime.
deleted by creator
Then you start off fresh going “this time it’s going to be different” but the same fucking things happen and you end up cramming that project in 3 weeks.
In the news this week: https://publicapps.caa.co.uk/docs/33/NERL Major Incident Investigation Preliminary Report.pdf
This is unprecedented since, well, January: https://en.wikipedia.org/wiki/2023_FAA_system_outage
Y2038 is my “retirement plan”.
(Y2K, i.e. the “year 2000 problem”, affected two digit date formats. Nothing bad happened, but consensus nowadays is that that wasn’t because the issue was overblown, it’s because the issue was recognized and seriously addressed. Lots of already retired or soon retiring programmers came back to fix stuff in ancient software and made bank. In 2038, another very common date format will break. I’d say it’s much more common than 2 digit dates, but 2 digit dates may have been more common in 1985. It’s going to require a massive remediation effort and I hope AI-assisted static analysis will be viable enough to help us by then.)
My dad is a tech in the telecommunications industry. We basically didn't see him for all of 1999. The fact that nothing happened is because of people working their asses off.
My dad still believes the entire Y2K problem was a scam. How do I convince him?
Well my dad does too and he worked his ass off to prevent it. Baby boomers are just stupid as shit, there’s not really much you can do.
deleted by creator
My dad had to stay in his office with a satellite phone over new years in case shit hit the fan.
How many UNIX machines in production are still running on machines with 32-bit words, or using a 32-bit time_t?
How much software is still running 32 bit binaries that won’t be recompiled because the source code has been lost together with the build instructions, the compiler, and the guy who knew how it worked?
How much software is using int32 instead of time_t, then casting/converting in various creative ways?
How many protocols, serialization formats and structs have 32 bit fields?
Irrelevant. The question you should ask instead is: how many of those things will still be in use in 15 years?
Windows, Linux, FreeBSD, OpenBSD, NetBSD, and OSX have all already switched to 64 bit time.
So they have a year 202020 bug then
I get the joke, but for those seriously wondering:
The epoch is Jan 1, 1970. Time uses a signed integer, so you can express up to 2^31 seconds with 32 bits or 2^63 with 64 bits.
A normal year has exactly 31536000 seconds (even if it is a leap second year, as those are ignored for Unix time). 97 out of 400 years are leap years, adding an average of 0.2425 days or 20952 seconds per year, for an average of 31556952 seconds.
That gives slightly over 68 years for 32 bit time, putting us at 1970+68 = 2038. For 64 bit time, it’s 292,277,024,627 years. However, some 64 bit time formats use milliseconds, microseconds, 100 nanosecond units, or nanoseconds, giving us “only” about 292 million years, 292,277 years, 29,228 years, or 292 years. Assuming they use the same epoch, nano-time 64 bit time values will become a problem some time in 2262. Even if they use 1900, an end date in 2192 makes them a bad retirement plan for anyone currently alive.
Most importantly though, these representations are reasonably rare, so I’d expect this to be a much smaller issue, even if we haven’t managed to replace ourselves by AI by then.
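If anyone wants to check those numbers themselves, here's a quick Python sketch (it just reproduces the arithmetic above, nothing authoritative):

```python
from datetime import datetime, timezone

SECONDS_PER_YEAR = 31_556_952  # average Gregorian year, as computed above

# A signed 32-bit counter of seconds runs out at 2^31 - 1 seconds after the epoch.
print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# Rough lifetimes of a signed 64-bit counter at different resolutions.
for name, ticks_per_second in [
    ("seconds", 1),
    ("milliseconds", 1_000),
    ("microseconds", 1_000_000),
    ("100 ns units", 10_000_000),
    ("nanoseconds", 1_000_000_000),
]:
    years = 2**63 / ticks_per_second / SECONDS_PER_YEAR
    print(f"{name:13s} ~ {years:,.0f} years after the epoch")
```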
Omg we are in same epoch as the butlarian crusade.
Butlarian crusade
Butlerian Jihad, my dude. Hate to correct you, but the spice must flow.
I'm just glad you got that reference
If you’re going to correct people about Dune quotes, at least use one from the book! “The spice must flow” doesn’t appear in any of them, it’s a Lynch addition.
an end date in 2192 makes them a bad retirement plan for anyone currently alive.
I can’t wait to retire when I’m 208 years old.
Tell that to the custom binary serialization formats that all the applications are using.
Edit: and the long-calcified protocols that embed it.
I am taking the week off, family camping, and cell phones off for that week in 2038.
What is the basis for the 2038 problem?
The most common date format used internally is “seconds since January 1st, 1970”.
In early 2038, the number of seconds will exceed 2^31 - 1, the biggest value that fits in a certain (also very common) data type. Numbers bigger than that will be interpreted as negative, so instead of January 2038 it will be in December 1901 or so.
Huh interesting. Why 2^31? I thought it was done in things like 2^32. We could have pushed this to 2106.
Signed integers. The number indeed goes to 2^32 but the second half is reserved for negative numbers.
With 8 bit numbers for simplicity:
0 means 0.
127 means 127 (the last number before 2^7).
128 means -128.
255 means -1.
Why not just use unsigned int rather than signed int? We rarely have to store times before 1970 in computers and when we do we can just use a different format.
Because that’s how it was initially defined. I’m sure plenty of places use unsigned, which means it might either work correctly for another 68 years… or break because it gets converted to a 32 bit signed somewhere.
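For anyone who wants to see that "converted to a 32 bit signed somewhere" failure mode concretely, here's a tiny Python sketch (ctypes.c_int32 is just standing in for whatever 32-bit field the value passes through):

```python
import ctypes
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def as_int32(value: int) -> int:
    """Interpret an integer the way a signed 32-bit field would store it."""
    return ctypes.c_int32(value).value

last_ok = 2**31 - 1              # 2147483647, the last second a 32-bit time_t can hold
wrapped = as_int32(last_ok + 1)  # one second later it wraps to -2147483648

print(EPOCH + timedelta(seconds=last_ok))  # 2038-01-19 03:14:07+00:00
print(EPOCH + timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00
```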
As an everyday user of software who’s not a developer, this is not a secret. Nothing works well for any extended period of time.
Because it has to fit into an ecosystem of tech that is constantly evolving. Software as a whole evolves more quickly than most tech. You see the same effect in every other branch of engineering, just slower.
Example: They are having problems rebuilding a certain famous church in Europe that burned down, because the trees that went into it are now all smaller. They can't get a replacement part.
I just dealt with this about a month ago at work. A customer machine died and they wanted “an exact replacement”. I explained to sales that is all I need to hear to know this project is going to be a disaster. Parts go out of stock, the network stuff is not as backwards compatible as people think it is, and standards change. They went over my head and demanded the same machine. I get daily emails from our fabricators about the problems they are having. Engineering is not a once and done thing. You need to have the staff and resources to continue to make your product match up with the environment it is in.
Package management is impossible. When a big enough package pushes an update, the house of cards will fall. This causes projects with greatly outdated package versions to exist in production, because there is no budget to diagnose and replace packages that are no longer available when a dependency requires a change.
Examples: AdminJS or AdminBro… one of them switched the package used to render rich text fields.
React-scripts, or is it create-react-app, I don't recall. Backend packages no longer work as-is on the frontend. Or something like that? On huge projects, who's got the budget to address this to get the project up to date?
This has to be a worldwide thing. There are way too many moving targets for every company to have all packages up to date.
It’s only a matter of time before an exploit of some sort is found and who knows what happens from there.
Does leftpad count as a collapse?
[in the US] your insurance dictates your healthcare, not your disease, deformity, symptoms etc. If your insurance pays for an allergy test, you’re getting an allergy test (even if you came in for a broken arm). If insurance pays for custom orthotics, you’re getting custom orthotics (even if you came in for a wart removal). We will bill your insurance thousands of dollars for things you don’t need. We’re forced to do it by the private equity firms that have purchased almost all of American healthcare systems. It’s insane, it’s wasteful. The best part is the person who needs the allergy test or the custom orthotics can’t afford it, so they don’t get the shit we give away to people who don’t need it.
I would gladly kill myself if it meant we got universal healthcare, but private equity firms can't monetize a martyr so it would be pointless.
Fuck everything about the current US healthcare system. The US can be so much more, can be so much better, if we could somehow just make a single percent stop fucking over the other 99%
deleted by creator
deleted by creator
As a paramedic, if you can’t remember your name, address, and social security number, we’ll take you to the hospital but you probably won’t get a bill. Unless you tell the hospital, then we’ll get a face sheet. Stay Safe, John and Jane Doe.
So if the paramedics take me to the hospital for a broken leg or something… and I claim that I don’t remember any of my identifying information, they’ll just treat my leg and let me go? They won’t keep me around to get to the bottom of my sudden amnesia?
Make up your name, address, and birthday and say you never memorized your social.
Infinite medical care glitch
What if you have your ID on you?
Take a page from Frank and Charlie’s playbook and learn how to eat shredded credit cards / licenses.
To shreds you say?
But how do I shred them first?
Just eat a shredder first
Pretty sure Mac has done that a few times too
The hospital will treat anything life threatening and then let you go.
New home construction materials are the lowest possible quality that will meet specs. The allure of a new coat of paint and modern design masks the cheap quality and low durability. Some doors are basically slightly stronger cardboard. My theory as to why American homes have gotten so huge is that for the same budget you can get a much larger volume of materials than in the past.
There is a golden period from about 1985 to 2000 where houses were built without asbestos but with real building materials. I only buy property built in this window.
Every property I’ve inspected built after 2010 that’s more than 5 years old is either splitting at the seams, sinking into the ground or both. They’re built from polystyrene with a coat of plaster. They’re built to palm off to naive new homeowners who don’t understand or landbankers who don’t give a fuck and I pity anyone trying to live in one for more than a few years.
My parents just sold their rock solid old house to have a new one built and I was so pissed off. Now I’m going to have to inherit this piece of shit when it’s falling apart. It’s less than a year old and already has a ton of issues they’re just living with because the builder refuses to fix anything and they apparently signed something that says there’s nothing they can do about it.
Housing cost still rising tho :/
Never buy a brand new home. Get one that’s at least ten years old. All the mistakes made during construction will have been found and hopefully fixed correctly. It’s still new enough to not have a lot of the old code issues that crop up in pre 1990s houses
How online ads actually work.
Very simplified TLDR: you visit a news site. They load an ad network and tell it “put ads here, here and here”.
The ad network now tells 300 companies (seriously, look at the details of some cookie consent dialogs) that you visited that news site so they can bid for the right to shove an ad in your face.
One of them goes “I know this guy, they’re an easy mark for scams according to my tracking, I’ll pay you 0.3 cents to shove this ad in their face”. Someone else yells “I know this guy, he looked at toasters last week, I want to pay 0.2 cents to show him toaster ads just in case he hasn’t bought one yet.”
The others bid less, so that scam ad gets shoved in your face.
That’s extremely simplified of course. https://en.wikipedia.org/wiki/Real-time_bidding has a bit more of an explanation.
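If it helps to see the moving parts, here's a toy version of that auction in Python (the bidder names and prices are made up; real exchanges also differ in how the winner's price is set):

```python
# Each bidder decides, based on its own tracking data, what an impression
# for this user on this page is worth to it (in cents).
bids = {
    "scam-network":  0.3,   # "easy mark for scams according to my tracking"
    "toaster-shop":  0.2,   # "looked at toasters last week"
    "generic-brand": 0.05,  # knows nothing about the user, bids low
}

# Highest bid wins the right to shove its ad in your face.
winner = max(bids, key=bids.get)
print(f"{winner} wins the auction at {bids[winner]} cents per impression")
```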
::hugs my PiHole::
It’s a good start but you absolutely want in-browser ad blockers too. Not all crap is served from dedicated garbage serving hostnames.
I recommend: LibreWolf + uBlock Origin for Desktop PCs and Mull + uBlock Origin for Android. Both web browsers are security hardened versions of Firefox.
I’d be interested in the amount of electricity that gets wasted on this
My guess is that it’s a couple watts while you’re actively using the internet, mostly due to the extra CPU load a few bad ads cause when they’re on your screen. Without having done the math I expect all the servers, data transfer etc. to be negligible, on a per-user basis, because they serve so many users.
That's another interesting thing btw. Most of the "internet thing X uses Y amount of electricity" claims are utter bullshit and massively exaggerated. What uses the most power on a desktop/TV is the screen. The second biggest consumer is likely your router (which is on whether you use it or not, but the studies usually ascribe all of the standby usage to your active usage - this makes sense if you're trying to look at "how much CO2 does all our digital stuff, including 'having an internet connection', cause" but not if you're trying to look at "how much extra CO2 does activity X cause, assuming I already have an internet connection because I'm not gonna live in a cave").
Don’t the fans use a lot of power? And wouldn’t a datacentre or server need a lot of cooling?
Keep in mind, one of the reasons we use data centers is that cooling one big room of computers is cheaper than cooling 200 small rooms with computers.
The server uses a kilowatt of power or more (most of it in the CPU). But if the server is serving 1000 active users concurrently, and only 5% of the time you spend online is spent fetching ads, 20000 people staring at their screens get their ads from, let's say, 2 kW of server power usage, plus another 2 kW for all the equipment to get the data there… for a total of about 0.2 watts per user.
These are completely eyeballed numbers, and could easily be off by an order of magnitude.
But your on-premises gear (screen, computer, router) is likely by far the biggest factor.
One easy way to cross-check power usage claims is cost. It will only catch the most egregious bullshit, but it’s easy. A random page I found claims that “According to the American Council for an Energy-Efficient Economy it takes 5.12 kWh of electricity per gigabyte of transferred data.”
A Steam game with 50 GB would thus consume 256 kWh. Even if your 300-watt idle gaming rig, 50-watt router and 150-watt screen (to watch the progress bar) spend 2 hours downloading that, that's 1 kWh. Even at 8 cents per kWh, that means just downloading the game would cost someone (not you) over $20. Do you think Steam would let you delete and redownload that game you bought on sale for $10 as much as you want if, between them and your ISP, someone had to pay $20 just in electricity each time? Not the game rights, not the servers, not the connection, just power.
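The cross-check in code form, in case you want to plug in your own numbers (all figures are the ones from the paragraph above, not measurements):

```python
claimed_kwh_per_gb = 5.12   # the claim being sanity-checked
download_gb = 50            # a big Steam game
price_per_kwh = 0.08        # dollars, deliberately on the cheap side

claimed_kwh = claimed_kwh_per_gb * download_gb      # 256 kWh
claimed_cost = claimed_kwh * price_per_kwh          # ~$20.48 per download

# What your own gear plausibly burns during a 2-hour download:
own_gear_kwh = (300 + 50 + 150) * 2 / 1000          # PC + router + screen = 1 kWh

print(f"claimed: {claimed_kwh:.0f} kWh (~${claimed_cost:.2f}), your own gear: {own_gear_kwh:.0f} kWh")
```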
Thanks, this makes a lot of sense.
I heard that (at least on YouTube) it isn’t only how high people bid but how likely someone is to click on your ad. Like if you have an ad they’re likely to click on you may get shown even if you bid less. You probably know more about it, I’m just sharing this because it sounded fascinating when I heard about it.
Those are different models. Ads can be sold pay per view, pay per click, or even pay per conversion (the store reports when the customer buys something and only pays for that).
These can be converted by multiplying with the estimated probabilities. For example, if the scammer is willing to pay $1 for the click, and the probability that the user will click is estimated to be one-in-500, the view would be worth 0.2 cents.
If the scammer is willing to pay $20 for the conversion (because it means they successfully scammed someone out of $30), they’d need to succeed scamming one in 20 users that clicked for this to work out.
Works the same for legit businesses of course, where the business will consider total lifetime value (not just the current sale - you might also subscribe to something and keep paying for 2 years, or come back to buy again). Advertising / customer acquisition costs are a huge part of many businesses, which is why running online ad platforms is so obscenely profitable.
In this case, I don’t know who in the chain will do the conversion - if the bid will be for a click and the ad platform will estimate how likely you are to click, or if the bidder makes the guess and bids based on that. The bidder in this case would be another ad platform of course, acting on behalf of the actual advertiser, and nobody in this “ecosystem” trusts each other. It’s full of companies trying to scam each other or companies offering services to validate that the data someone is feeding you is real.
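The per-view / per-click / per-conversion conversion is just expected value; a minimal sketch with the numbers from above:

```python
def value_per_view(pay_per_click: float, p_click: float) -> float:
    """What a pay-per-click bid is worth for a single ad view."""
    return pay_per_click * p_click

def value_per_click(pay_per_conversion: float, p_convert: float) -> float:
    """What a pay-per-conversion deal is worth for a single click."""
    return pay_per_conversion * p_convert

# $1 per click and a 1-in-500 chance of a click makes a view worth 0.2 cents.
print(value_per_view(1.00, 1 / 500) * 100, "cents per view")

# $20 per conversion needs at least 1 in 20 clicks to convert to justify $1 per click.
print(value_per_click(20.00, 1 / 20), "dollars per click")
```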
And how you're tracked online. I've worked on Google ads accounts every day for a decade and I don't see you, the user, or your data.
I just click “female, 50+, likes home decor, uses a phone” and then a little business I work with bids 10% extra on you because they think you might be interested in their new autumn wreaths they’re super proud of, and Google think you fit that box I ticked.
And that’s advanced marketing for most businesses. Most businesses won’t even get into the audience side of things and they’ll stick to keywords: they’ll show you an ad because you searched for “autumn home decor” and that’s all.
Google take advantage of most advertisers by saying "let us be in charge of your keywords, and how much money you spend, our AI is smarter than you and you don't have time!" And most businesses just use the automatic stuff because they don't understand it, and it's true, they don't have time… so then Google takes your "autumn wreath" keyword and shows your ads to someone looking for "Christmas trees", because they're both seasons and they're both plant related, right?
And then the small business gets charged $1 by Google to show their autumnal page to someone who wasn’t interested and left right away.
My job is to help these businesses actually make an advertising account that doesn't fall for all these little bear traps that Google sets all over their ads interface. They weren't there 7 years ago, but things have been getting worse and worse. Including third party sales companies like regalix, hired by Google to constantly call you and tell you to trust the automation and spend more.
It's fascinating that the enshittification is taking place on both ends of Google. I would have thought that the slow bastardization of search was for the benefit of advertisers but it's bad for everyone except Google.
That was always part of the enshittification formula. The final stage after exploiting users is to exploit business customers to the breaking point.
I used to be a funeral director. The majority of outsiders were unaware of pretty much everything we did. Often on purpose because thinking of death is uncomfortable.
The biggest “secret” is probably that the modern funeral was invented by companies the same way diamond engagement rings were. For thousands of years the only people who had public funerals were rich and famous. It was the death of Abraham Lincoln that sparked the funeral industry to sell “famous people funerals at a reasonable price”. You too could give your loved one a presidential send off! The funeral industry still plays into this hard, and I’ve found many people are simply guilt tripped by society to have a public funeral.
Not so fun story:
One of my first jobs when I was barely 18 was with one of the big funeral home/cemetery providers in the US. It was positively horrible, and not for the reasons most people think.
As a new hire, you’d start on the cold-calling phone banks, which was bad enough. Nobody wants a cold marketing call from a cemetery. But it got worse from there.
After a month on the phone bank, I’d done well enough to be promoted to field sales, which meant going to the most impoverished areas of town to follow up on the appointments the phone bank had made, basically trying to scare poor elderly people into handing over what little they had to ‘pre-plan’ for their deaths, with the pitch that if they didn’t, their family would suffer.
After a few appointments it was clear I didn’t have the stomach for that, so they moved me to on-site sales, which was somehow worse.
On-site sales included helping to host the Mother's Day open house at the large main cemetery. They set up a greeting station at the entrance with refreshments and 'in memoriam' wreaths that could be bought by bereaved family (on that day, mostly children of the deceased, but also mothers who had lost their children, some at a very young age). It sounds like a kind thing to do, because many young mothers/fathers coming to visit were so distraught, they hadn't stopped for coffee or thought about flowers.
I was not stationed at the welcome station. I was a ‘roamer’, meaning I was one of several staff expected to meander through the graves and check on families graveside – to ask if they needed anything and to upsell them pre-planning packages for themselves or their other children. I am not kidding, we were expected to do that.
I had to be prodded to approach my first mark (a young couple 'celebrating' the woman's first Mother's Day at the grave of her several-months-old child), and I couldn't stomach it. It felt barbaric to even try to sell to someone who could not stop crying at the grave of her young child. I couldn't do the pitch, obviously, and backed out as soon as possible, then hid by the skips behind the main building until the end of the day, when I quit.
I’ve done many jobs in my life, including cleaning bowling alley toilets, but I’ve never been asked to do anything as vile.
I’ll bet everyone in the funeral industry can guess which company I’m talking about.
I also had the pleasure of working for Service Corporation International. Thankfully solicitation of funeral services is banned in Ontario, Canada. So no cold calling or bugging people at cemeteries. Their way around it was to hold seminars about Last Wills at places like retirement homes. If someone had a funeral related question the staff would get them to sign a form agreeing to a phone call or visit from a sales person.
The pre-arrangement sales people were all on commission and it made them very pushy. The pitches were so manipulative I couldn’t listen to them. Our government is throwing around the idea of banning commissioned sales in funeral services as well because of it. Some other Canadian provinces have already banned it.
Their practices are so scummy, I’m surprised they’re still allowed to operate at all in Canada. Glad they can’t do their worst in Ontario, that’s a small win.
You’re right about their abhorrent manipulation – I still have binders in storage from my sales training; I should dig them up and post some of it. It’s still, 35 years later, the most disgusting emotional manipulation I’ve ever seen. After all these years, it’s only got worse in the US from what I hear.
You were supposed to ask them to relive their most recent familial death experience under the guise of polite conversation, then home in on whatever detail was the most unpleasant, and hammer home how if they didn't buy a package, their children would go through worse. Have they considered how much emotional and financial pain they would cause if, god forbid, they died tomorrow? Don't take time to think about the money you don't have, because every hour of delay raises the chances your kids will be left with a financial mess when they're grieving you. You're basically heartless for doing that to them.
The graveside pitch was even worse. It’s so sad you lost your baby last month, but what if your six-year-old died tomorrow? Are you prepared for that? Like jesus, I can’t imagine the paranoia a grieving family faces after losing one child, constantly afraid for their remaining child. Let’s rub salt in that wound and scare the shit out of them for a few thousand dollars. It should be illegal everywhere.
What do you mean by “public funeral”? What’s the alternative? It sounds like you’d consider an event with only friends and family where there was a coffin in a room to be a “public funeral”. That seems to be what most people have, but it isn’t very public. Is a non-public funeral one where the family makes the coffin themselves and there’s no event where people see the dead person and the coffin?
The minimal services are essentially transportation, government documentation, and disposition (cremation, burial, entombment, etc). Some funeral homes won’t charge for a private viewing by immediate family, some charge a small fee. Typically there’s a cap on number of people and amount of time, something like 10 people total for 30 minutes.
Anything more than that will require you pay thousands of dollars extra. Hours of receiving guests, a published obituary, a mass or ceremony, musicians, clergy/celebrants, reception. All of those are pushed as “traditional” or expected but they’re incredibly expensive.
deleted by creator
You didn’t talk about how coffins are sold for many thousands of dollars when they are just cheap plywood boxes that shouldn’t cost more than a hundred bucks and that serve no purpose other than to decay as quickly as possible.
While I do think expensive caskets are a waste of money, they’re actually one of the least marked up products sold at a funeral home! Typically, caskets and urns are sold for twice what they’re bought for wholesale. This is mostly because anyone can sell caskets and urns so they can’t have ridiculous markups or people will go elsewhere for them. Urns for example are almost always bought off Amazon instead of at a funeral home.
The products with the highest markups were insurance based. Estate Fraud insurance (if someone steals the dead person’s identity, the insurance company will pay any costs involved in correcting it) and Travel insurance (if you die on vacation, the insurance company will pay any costs involved in bringing the body home). Both of these insurance policies had real costs of about $10 or $20. They’re often sold for $300 to $500.
Technically not my industry anymore, but: companies that sell human-generated AI training data to other companies most often are selling data that a) isn’t 100% human generated or b) was generated by a group of people pretending to belong to a different demographic to save money.
To give an example, let's say a company wants a training set of 50,000 text utterances of US English for chatbot training. More often than not, this data will be generated using contract workers in a non-US locale who have been told to try and sound as American as possible. The Philippines is a common choice at the moment, where workers are often paid between $1-2 an hour: more than an order of magnitude less than what it would generally cost to use real US English speakers.
In the last year or so, it’s also become common to generate all of the utterances using a language model, like ChatGPT. Then, you use the same worker pool to perform a post-edit task (look at what ChatGPT came up with, edit it if it’s weird, and then approve it). This reduces the time that the worker needs to spend on the project while also ensuring that each datapoint has “seen a set of eyes”.
Obviously, this makes for bad training data – for one, workers from the wrong locale will not be generating the locale-specific nuance that is desired by this kind of training data. It’s much worse when it’s actually generated by ChatGPT, since it ends up being a kind of AI feedback loop. But every company I’ve worked for in that space has done it, and most of them would not be profitable at all if they actually produced the product as intended. The clients know this – which is perhaps why it ends up being this strange facade of “yep, US English wink wink” on every project.
When your favorite band cancels their gig because the lead singer has “come down with the flu”, that’s industry code for “got too wasted, and is currently too busy getting alcohol and possibly drugs out of their system to perform”.
I even worked one show that had to end after 20 minutes because one guy in the band was visibly under the influence, refused to play, talked to his hallucinations, then spent a few minutes talking to the audience about how his foot was evil and wanted to kill him, before the tour manager could drag him off stage. Then he tried to assault several backstage staff for not allowing him to cut off his foot. This was on a tour that promoted alcohol free rockshows btw, so we didn’t provide alcohol to the artists backstage. God knows what he might’ve purchased from our local street dealers lol.
The next day in the papers, the headline says “[the band] cancels first week of reunion tour after flu outbreak” 🙃 Yes, of course
I always wondered why Paul Westerberg caught the flu so much. When I finally got to see him live a few years ago he definitely was coming down with the flu on stage.
Sysadmins have no idea what they are doing, we’re just one step ahead of the rest of you at googling stuff.
Outsourced IT provider here:
90% of businesses have basically zero IT security. Leaked passwords in regular use and no process or verification for password resets. As soon as someone complains that 2FA or password rotation is difficult it gets dropped. Virtually all company data is stored on USB keys, plaintext hard drives and on staff’s personal home devices.
The reason they’re not constantly having their data stolen is because no-one cares about the companies either.
Isn’t password rotation a horrible practice because it makes people use passwords like “MyNewPassword15” since it’s the 15th password reset they’ve been forced to do?
password rotation is generally not considered a “best practice” but not doing something because it’s not a best practice is only a good strategy if you’re actually going to follow the best practices. password rotation is less effective than a good password manager and long randomly generated passwords that are unique to each site. requiring passwords be rotated can be an impediment to using strong unique passwords, which is why it’s not a good practice.
but a freshly rotated “MyNewPassword15” is a million times better than your password being “password”, or being the same thing you use on every sketchy website whose database has been breached a dozen times.
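For the "long randomly generated passwords" part, the standard library is all you need; a minimal sketch (obviously a password manager then has to remember it for you):

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def random_password(length: int = 32) -> str:
    """Generate a long random password from a cryptographically secure RNG."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(random_password())  # unique per site, stored in the manager, not rotated on a schedule
```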
The password for your "emergency" system account (the one that shouldn't be root) still needs to be rotated every time someone with access leaves or changes job roles.
Any password someone who leaves had access to should be rotated, no?
deleted by creator
Password rotation is very insecure. No one should be doing that. I also hate when companies set a maximum length for a password, like 12-16 characters. Bitch, my 32-character password is much more secure!
I believe Microsoft’s 365 platform helps a lot in that matter. Even without any security strategy or custom configuration M365 offers a better security level than those businesses could ever reach themselves.
Which might explain why medium sized companies that are not completely clean-nosed are happy to run Windows 10 with all its spyware elements running unregulated.
It’s also terrible in the government sector, which is why the NSA’s huge database on US internet traffic is accessible to rivals like Russia, Iran and China.
Yeah cool. Got a source on that NSA comment there do you?
Many software developers care even less about security than the people who use the software. Their attitude is that it’s just more work to do things in a secure manner. It’s only after a major security breach that they fix their security holes.
many software developers
Most individuals care about security, but most companies’ reward structure does not reward proactive security measures. Alice will get a much bigger bonus if she spends 20 hours straight fixing a zero-day exploit in the wild than if she had spent a week implementing proper safeguards in the first place.
That’s not fair. I care about security a lot. But implementing security takes time, and hiring me for more hours costs more money. So most entities that need software developed want the solution that costs less and is faster to develop, they don’t really understand what “security” even means. And the reality is, if you really want security in your software, you’re not hiring a dev to make a piece of software, it is a continuous expense to keep the software patched and secured, which is not what most companies want. I’m billing for the hours either way. You just need to point me to the guy who’s willing to pay.
And I also don’t know anyone who feels incentivized to fix security holes. It’s the software equivalent of having to fix the leaky mystery toilet in a dive bar. Yes, the pay might be high, but it’s also extremely stressful and you’re taking on a lot more responsibility - because it’s already too late. Plus it puts a strain on the relationship with the customer who paid you to develop the software, even though we both know they were the ones who didn’t want to pay to prevent this in the first place. If you think I’d rather stay on high alert 24 hrs a day thursday-monday to fix some preventable shit, than be at home with my family on the weekend, you’re insane. The bonus might make it tolerable. I’d still rather not.
At least you’re creating more job opportunities for IT Security people. :P
Worth pointing out this isn’t usually down to developers choosing not to do it. But management either via direct decision making or deadlines.
It’s not that they don’t care, not at all. But when you have a road map and hard deadlines you don’t have the option. And it’s hard to sell security as a priority to leadership when the other option is features that can increase revenue.
Same thing in distribution. They promote “safety, safety, safety,” but as soon as productivity dips, “you guys aren’t hitting your numbers, you need to do better.”
It’s only after a major security breach that they fix their security holes.
I had a feeling based on constant news of data breaches.
I have worked in the gaming industry and let me tell you that in some game studios most of the people involved in making the games are not gamers themselves.
Lots of programmers and artists don’t really care about the final game, they only care about their little part.
Game designers and UX designers are often clueless and lacking in gaming experience. Some of the mistakes they make could be avoided by asking literally anyone who plays games.
Investors and publishers often know very little to almost nothing about gameplay and technology and will rely purely on aesthetic and story.
You have entire games being made top to bottom where not a single employee gave a fuck, from the executives to the programmers. Those games are made by checking a series of checkboxes on a plan and shipped asap.
This is why you have some indie devs kicking big studio butts with sometimes less than 1% of the resources.
Afaik even in other "similar" industries (e.g. filmmaking) you expect the director, producers and distributors to have a decent level of knowledge of the challenges of making a movie. In the video game industry everyone seems a bit clueless, and risk is mitigated by hiring large teams, and by shipping lots of games quickly.
I’ve been a game programmer for >10 years and I would be fucking miserable if I spent most of my free time with video games as well. Isn’t that what we call work/life balance? And from my experience, most game devs either stop being “gamers” at a certain point, or they burn out and quit the video game industry.
That being said, almost everyone I know from gamedev is really excited about video games, and they have a ton of experience, even if they are not playing games in their free time anymore. It could be because I’ve only worked for indie projects and small publishers.
Agree on all points, I was just making an observation. It sucks that all the people with money don’t care though.
Yeah, that explains it I think. Making video games is hard work and it is normal to quit or stop caring.
I’m not sure what kind of role you had in the industry, but I’m not sure what you’re saying is entirely accurate… although there are some bits in there I agree with:
Lots of programmers and artists don’t really care about the final game, they only care about their little part.
Accurate. And that's ok. A programmer whose job it is to optimize the physics of bullet ricochet against thirteen different kinds of materials can go really deep on that, and they don't need to (or have time to) zoom out and care about the entire game. That's fine. They have a job that is often highly specialized, has been given to them by production and they have to deliver on time and at quality. Why is that a problem? You use the analogy of film, and nobody cares if the gaffer understands the subtext of the Act 3 arc… it's not their job.
Game designers and UX designers are often clueless and lacking in gaming experience. Some of the mistakes they make could be avoided by asking literally anyone who plays games.
Which one? A game designer lacking in gaming experience likely wouldn’t get hired anywhere that has an ounce of standard. A UX designer without gaming experience might get hired, but UX is about communication, intuition and flow. A UX designer who worked on surgical software tooling could still be an effective member of a game dev team if their fundamentals are strong.
Investors and publishers often know very little to almost nothing about gameplay and technology and will rely purely on aesthetic and story.
Again, which one? Investors probably don’t know much about the specifics of gameplay or game design because they don’t need to, they need to understand ROI, a studio’s ability to deliver on time, at budget and quality, and the likely total obtainable market based on genre and fit.
Publishers – depending on whether you are talking about mobile or console/box model – will usually be intimately familiar with how to position a product for market, what KPIs (key performance indicators) to target and how to optimize within the available budget.
This is why you have some indie devs kicking big studio butts with sometimes less than 1% of the resources.
This has happened. I’m not sure it’s an actual trend. There are lots of misses in the game industry. Making successful products is hard – it’s hard at the indie level, it’s hard at the AAA level. I would estimate there are a thousand failed Indies for every one you call out as ‘kicking a big studio’s butt.’ Lots of failed AAA titles too. It’s just how it goes.
The same, by the way, is true of film, TV, books and music. A lot of misses go into making a hit. Cultural products are hard to make, and nobody has the formula for success. Most teams try, fail, then try again. Sometimes, they succeed.
Hey, fair points. I am not saying that all big studios are bad and all indies are good. The industry is definitely getting carried by indies in some genres and that is ok.
It would seem you agree on most points; as someone passionate myself, it just surprised me to sometimes be surrounded by people who didn't really care. It depends on the project and the studio of course. I can't really blame the workers though, as I said, so I agree with you that it makes sense in most cases to not recruit only "gamers". Thanks for sharing!
you expect the director, producers and distributors to have a decent level of knowledge of the challenges of making a movie.
But not about the source material.
Adaptations nowadays suck ass because there’s no fans in charge.
Game designer here. I'd say there are degrees to this. Most game designers I know love to play games, but which types of games do they love? You might say a well-trained designer should be able to design anything, but most people have a thing they're keener on than others.
So they may be very well versed in some genres and not so much in others. Now, getting a job at the right company making the right kind of game that you're keen on might not be that easy.
I've seen some young to old designers who only play certain types of games and are clueless about other types, and some who can adapt their design skills to many types of games.
But game development can't happen without testing; if they design something wrong, it should show up in testing…
Still, sometimes you've got the best devs and the test results come out and surprise you.
It's easy to point out what's wrong looking at the end product, but when the design starts from a clean drawing board it's also very easy to miss things.
Game dev is hard man…
I 100% believe this. It doesn't necessarily mean it's bad work, but the way you phrased it makes a dozen or so instances of "something feeling weird" make sense. Sometimes it's just a mismatch between the intensity of the fans' fanaticism and the developer having to go to work every day and do a job.
I think game developers can harness this by embracing their modding communities. I’m currently waiting to see what Cities Skylines 2 is like. It has to be hard for a bunch of devs who seem like normies to develop a game for a bunch of nerds, some of whom know more about civil engineering and traffic planning than real engineers. :D The original was well-modded and it feels like the game was a collaboration between the community and the developers. To me, this kind of bridges the developer/gamer gap.
It was a thing I noticed regarding Ubisoft games that upper management seems to wish to make movies rather than games. It really hit home for me when the knife fight with Buck Hughes in Far Cry 3 was just a long chain of quicktime events, like it was Dragon's Lair from the 1980s; disappointing because Far Cry 2 was all about game mechanics telling the story rather than cutscenes. (Which was, admittedly, at cross purposes. It's a game about violence in a failed state in Africa with the futility of violence as a running theme. But it did that very well.)
Idk but I really like Valve, I believe they care about the stuff they push out.
Do you think this occurs equally across all big budget games?
No, not to the same extent. I mean past a certain size we probably shouldn’t expect big executives to care, but you still have a lot of passionate people in this industry, so you can totally have “true” gamers working in big budget games.
A lot of the same things you mention about game development are also apparent in open source software, which is why it is usually so terrible. Just because someone can program some complicated visuals for a 3D modeling program doesn't mean that same person actually does 3D modeling, which is why the interfaces of so many open source programs are abysmal.
I have a friend who has been coding various things for years and they are never successful because he builds interfaces he understands how to use. No one else does things his way.
Yup! That right there. You give a technical person a job that requires some level of “soft skills” and that is what you get.
This is probably true of many many other industries. I work in automotive and while a lot of us care about delivering a quality product, the majority are not “car people” and have never changed a part on their car.
Yeah, it is kind of the default, isn't it? It kinda makes sense for the programmers and artists, but it is still kinda weird that the actual designers don't really understand why people play video games. You wouldn't expect a movie director to not like movies, or a car designer to not like cars. I guess it must be happening everywhere at least to some degree.
Nowadays I would compare some game studios to what some boy bands were to music. You start with some guys with money who are neither musicians, nor sound engineers, nor anything really. They pick singers and musicians based on looks and market research, they hire a large team of specialized workers, and then they spend millions on marketing to flood the space with their new album. The indie developers in this scenario would be Pink Floyd.
It wasn't always like this, at least for video games. I feel like from the 80s up to the early 00s it was mostly dominated by passionate workers, but there just aren't enough passionate workers for the demand. As the industry grew, big players started building those "soulless" projects to make a good return on investment. Not to denigrate the individual contributions of the workers, but sadly the people who own those businesses don't really care if they're making games or cars or selling cigarettes. They care about ROI.
Loading animations on websites and some apps that give you a percentage and messages about what’s going on are usually faked with animations. The frontend for things like that usually just puts fake messages and animations because it’s not easy to track the stages of complex steps happening on the backend. It’s possible in some cases but I don’t think I have ever seen a real working version of a loader like that in my 15 years of experience.
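A minimal sketch of how that's typically wired up (everything here is made up for illustration): the percentage and the status message come from a timer, completely decoupled from the backend call.

```python
import threading
import time

def real_backend_work() -> None:
    """Stand-in for the actual request; the frontend never sees its progress."""
    time.sleep(3)

def fake_progress(stop: threading.Event) -> None:
    """Animate a made-up percentage on a timer; it never saw real progress data."""
    percent = 0
    while not stop.is_set():
        if percent < 95:          # stall near the end, like every fake loader does
            percent += 5
        print(f"{percent:3d}%  Processing your request...")
        time.sleep(0.2)
    print("100%  Done!")          # jump to 100% only when the real work finishes

stop = threading.Event()
loader = threading.Thread(target=fake_progress, args=(stop,))
loader.start()
real_backend_work()
stop.set()
loader.join()
```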