Beany wrote: Mon Apr 14, 2025 6:45 pm
Yeah, but a tonne of aluminium is actually useful.
Aren't they firing back up old nuclear power stations in 'merica to supply future power needs of AI? Can't remember if it's Google, Apple or MS looking at nuclear microreactors to power their AI server farms...probably all three.
Basically, the figures in the diagram jamcg shared are probably going to grow exponentially, whereas producing your tonne of aluminium isn't going to go up in energy use, generally.
AI does have its place as a very useful tool, but the current bandwagon of every company jumping on it and using it where it's not needed makes it a bit of a fad at the moment.
The Google AI overview annoys me. Just searching something on Google and it comes up on top with some bullshit wishy washy answer. Would be better if it only came up if you pressed a button for it.
Mito Man wrote: Mon Apr 14, 2025 8:07 pm
The Google AI overview annoys me. Just searching something on Google and it comes up on top with some bullshit wishy washy answer. Would be better if it only came up if you pressed a button for it.
Yep, it's stupid and I keep on reporting it. I had a result appear yesterday which was simply wrong. Today I had a result which said the UK consumes 12 million loaves of bread per day, with the next paragraph stating the UK consumes 11 million loaves of bread per day.
Although apparently you can get rid of it by putting a modifier (I think it was -ai) in your search?
I understand putting a swear word in causes it not to give the AI result - not sure you want results about the UK population fucking loaves of bread though.
I've already found a Chrome extension which blocks it: Hide Google AI Overviews.
This is hilarious. Someone discovered that if you type in some random sentence followed by the word "meaning", Google AI will provide an explanation of the completely fictitious idiom you just invented.
"if you lie down with noodles you'll get sauced meaning" generates
The idiom "if you lie down with noodles, you'll get sauced" means if you associate with someone who is known for a certain behavior (in this case, being "sauced," or intoxicated), you will likely be influenced to do the same or be affected by the behavior. It's a cautionary proverb about the dangers of bad company.
"you can't polish a bald man twice meaning" generates
The idiom "you can't polish a bald man twice" means you can't make a mistake that has already been made or fix a problem that has already been fixed. It implies that there's no point in repeating something that's already been done, especially if it's been done incorrectly or doesn't require further action. The saying highlights the futility of trying to improve upon something that is either already at its best or doesn't lend itself to further improvement.
The saying "a person with an itch has a smelly finger" is a proverb, particularly popular in Kenya, that suggests a person who has a bad habit, like scratching or engaging in a particular activity, will be noticed by the effects of that habit. An Instagram post uses this proverb, which translates to "If you go to sleep with an itching anus, you are sure to wake up with smelly fingers," to highlight the idea that one's actions, especially those related to personal hygiene, can have noticeable consequences.
The expression "farting in love from the heart" is a humorous way to suggest a deep level of trust and comfort in a relationship, where bodily functions are accepted and shared without embarrassment. It signifies a stage where partners feel so at ease that they can be their most natural selves, including releasing gas around each other.
I saw this earlier - my Bluesky feed was basically just these. Sadly I have not been able to try it, because I've successfully managed to stop Google giving me AI results on my phone and in all the browsers on my PC - everywhere I've tried.
Mito Man wrote: Wed Apr 23, 2025 3:29 pm
The saying "a person with an itch has a smelly finger" is a proverb, particularly popular in Kenya, that suggests a person who has a bad habit, like scratching or engaging in a particular activity, will be noticed by the effects of that habit. An Instagram post uses this proverb, which translates to "If you go to sleep with an itching anus, you are sure to wake up with smelly fingers," to highlight the idea that one's actions, especially those related to personal hygiene, can have noticeable consequences.
ChatGPT regurgitating grade 4 "Confucius say" jokes was not on my bingo card. Don't run behind the bus, you'll get exhausted.
We’re letting a contract for a few £m of DevSecOps work and we’re having to write in clauses about not using AI - because all interactions are recycled as training data, there’s a security risk of <stuff> getting out.
We agonised a lot over that in a previous role. It will be almost impossible to enforce, short of using audit rights. I don't think there are any technical controls that you can ask for that will meaningfully work. In theory whatever AI systems they have/use could be set to exclude internal locations, but that would be difficult to manage and evidence, although that would be their headache.
I’ve spent the past two days correcting crap sent out by others, generated by ChatGPT.
The first was a project director who decided to “help” the client by answering some legal contract questions. Instead of just asking me, he massively overcomplicated things and told the client a load of stuff that doesn’t actually apply to their contract, because he just searched a few words and asked it to expand on them.
The second was my MD, who decided to create a “useful” matrix for grading new candidates that he clearly hadn’t read through or role-played, as it just didn’t work. It looked pretty, though.
It’s useful, but we’re a fucking long way from AI just being able to do stuff reliably.
I still get invitations to trial legal AI tools pretty much daily. I give the senders short shrift. There's a hearing this week - https://www.judiciary.uk/judgments/alha ... ney-order/ - where the court has summoned the lawyers involved in a couple of recent cases where fake citations were included to explain themselves. I anticipate that severe sanctions will be issued.
Sundayjumper wrote: Wed May 21, 2025 9:26 am
We’re letting a contract for a few £m of DevSecOps work and we’re having to write in clauses about not using AI - because all interactions are recycled as training data, there’s a security risk of <stuff> getting out.
We’ve no idea how to enforce it though.
I'd guess for a contract there'd be massive penalty clauses for evidenced use of LLM/AI generated content? And make it clear that ignorance of this would be grounds for breach of contract with further penalties etc?
It's the only way some of these pricks will understand, and it'll scare off the amateur hour techbros from even going for it.
I must admit I use ChatGPT and Claude.AI to assist writing prose and code and I find it helpful. Sometimes it is useful to give you a structure when you are staring at a blank page.
Jobbo wrote: Wed May 21, 2025 12:40 pm
I still get invitations to trial legal AI tools pretty much daily. I give the senders short shrift. There's a hearing this week - https://www.judiciary.uk/judgments/alha ... ney-order/ - where the court has summoned the lawyers involved in a couple of recent cases where fake citations were included to explain themselves. I anticipate that severe sanctions will be issued.
A first occurred recently when I was asked to review an academic paper and it quickly became apparent that AI had been used heavily in the writing. The giveaway was the references, which on the surface seemed legit (i.e. they sounded like journals that might exist and were in the correct format) but were either totally fictitious or in no way supported the argument being made.
There is a lot made of using AI in medicine, and I think there are some areas where it can genuinely help (decision support tools, analysis of complex data such as images, crawling patient notes to extract usable data, modelling flow through hospitals so you know which wards to put people on, etc.), but the idea that AI is a gamechanger in the way a lot of fancy startups portray it is a bit laughable.