To be honest, the explanation given in the screenshot makes sense. Whilst it’s frustrating, if the mods have had past problems with arguments over OSes (and there are dedicated subs for them), I can somewhat understand the reason for the rule.
There are already talk-to-your-pet products such as FluentPet. Probably the biggest issue with cats in particular is that their “vocabulary” is quite limited (usually fewer than a dozen distinct “meows”), but some of the cats using FluentPet buttons (there are examples on YouTube, such as BilliSpeaks) seem to show basic reasoning. A full-blown language is beyond them, but they do seem capable of understanding more concepts than we give them credit for.
The ultimate goal is to speak dolphin, if indeed there is such a language. The pursuit of this goal has led WDP to create a massive, meticulously labeled data set, which Google says is perfect for analysis with generative AI.
So they’re aiming for a real-life version of SeaQuest DSV? Considering Season 1 was set in 2018-2019, we’re 7 years behind schedule so far…
One can only conclude that this is either the latest step in a deliberate effort to sabotage the functioning of the US (and, by extension, much of the West), or just another monumentally stupid idea brought to life by their limitless incompetence.
The irony is that, according to the article, it already does. What is changing is that the LLM will be able to use more of that data:
OpenAI is rolling out a new update to ChatGPT’s memory that allows the bot to access the contents of all of your previous chats. The idea is that by pulling from your past conversations, ChatGPT will be able to offer more relevant results to your questions, queries, and overall discussions.
ChatGPT’s memory feature is a little over a year old at this point, but its function has been much more limited than the update OpenAI is rolling out today… Previously, the bot stored those data points in a bank of “saved memories.” You could access this memory bank at any time and see what the bot had stored based on your conversations… However, it wasn’t perfect, and couldn’t naturally pull from past conversations, as a feature like “memory” might imply.
Because any tech has privacy risks?
We’re no longer living in a world where the only major cybersecurity threats are CCP-backed tech and Russian hackers.
The children don’t deserve to die because of their parents.
Precisely what I am doing. I have too many devices that still do what I need to simply ditch them just because Windows 10 is EOL. I’m a bit over halfway through my migration (still have a few programs to sort out - I may have to run a W10 VM for a couple of them, as they don’t work under WINE and there is no Linux equivalent).
The browser addon “AdNauseam” can help with that, although it’s not a complete solution.
Got to provide a continuous supply of children for the labour camps.