

Blasted is nothing, wake me up when they get slammed.
I dropped this /s
It’s called a dialectic.
Mixture of experts is the future of AI. Breakthroughs won’t come from bigger models; they’ll come from better coordinated conversations between models.
Gaming is an absolutely massive economic sector, driven by the escapism of virtual worlds. The functional kernel of the metaverse is a universal game lobby, a place for people to congregate while they navigate between the games they play together.
Humans can learn a bunch of stuff without first learning the content of the whole internet, without the computing power of a datacenter, and without consuming the energy of Belgium. Humans learn to count at an early age too, for example.
I suspect that if you took into consideration the millions of generations of evolution that “trained” the basic architecture of our brains, that advantage would shrink considerably.
I would say that the burden of proof is therefore reversed. Unless you demonstrate that this technology doesn’t have the natural and inherent limits that statistical text (or pixel) generators have, we can assume that our minds work differently.
I disagree. I’d argue evidence suggests we’re just a more sophisticated version of a similar principle, refined over billions of years. We learn facts by rote, and learn similarities by rote until we develop enough statistical text (or audio) correlations to “understand” the world.
Conversations are a slightly meandering chain of statistically derived cliches. English adjective order is universally “understood” by native speakers based purely on what sounds right, without actually being able to explain why (unless you’re a big grammar nerd). More complex conversations might seem novel, but they’re just a regurgitation of rote memorized facts and phrases strung together in a way that seems appropriate to the conversation based on statistical experience with past conversations.
Also, you say immature technology, but this technology is not fundamentally (i.e. in terms of principle) different from Weizenbaum’s ELIZA in the '60s. We might have refined the model and thrown a ton of data and computing power at it, but we are still talking about programs that use similar principles.
As with the evolution of our brains, which have operated on basically the same principles for hundreds of millions of years. The special sauce between human intelligence and a flatworm’s is a refined model.
So yeah, we don’t understand human intelligence, but we can appreciate certain features that GPTs absolutely lack, like a concept of truth that comes naturally to humans.
I’m not sure you can claim that absolutely. That kind of feature is an internal experience, you can’t really confirm or deny if a GPT has something similar. Besides, humans have a pretty tenuous relationship with the concept of truth. There are certainly humans that consider objective falsehoods to be Truth.
It’s also pretty young, human toddlers hallucinate and make things up. Adults too. Even experts are known to fall prey to bias and misconception.
I don’t think we know nearly enough about the actual architecture of human intelligence to start asserting an understanding of “understanding”. I think it’s a bit foolish to claim with certainty that LLMs in a MoE framework with self-review fundamentally can’t get there. Unless you can show me, materially, how human “understanding” functions, we’re just speculating on an immature technology.
I dabble in comics and manga too, so I prefer the full screen to an e-ink display. It’s also nice for videos.
I just got a foldable. The increase in functionality of reading on my phone is substantial, and that’s such a big fraction of my phone use that I consider it worthwhile. I wouldn’t be as productive if I had to carry a bulky e-reader with me all the time, it’s incredibly convenient to be able to fold it up and put it in my pocket.
It’s a bit heavier, but I got used to it quickly. My old phone feels suspiciously light now, like a toy. The expense is certainly a factor, but for me the utility is worth it in the long run. It’s not for everyone, but there are people it makes sense for.
Bet you read that in a textbook
I was in my university’s Society of Physics Students, and some of the members got to have dinner with NDT after a talk he gave at the school. Reports confirm he is a self-centered, arrogant douchebag
I use it for generating illustrations and NPCs for my TTRPG campaign, at which it excels. I’m not going to pay through the nose for an image that will be referenced for an hour or two.
I also use it for first drafts (resume, emails, stuff like that) as well as brainstorming and basic Google-tier questions. Great jumping-off point.
An iterative approach works best for me: refining results until they match what I’m looking for, then editing manually until I’m happy with them.
I think the obvious answer is “Yes, some, but not all”.
It’s not going to totally replace human software developers anytime soon, but it certainly has the potential to increase productivity of senior developers and reduce demand for junior developers.
Move fast and break things, I guess. My takeaway is that the genie isn’t going back in the bottle. Hopefully failing fast and loud gets us through the growing pains quickly, but on an individual level we’d best be vigilant and adapt to the landscape.
Frankly, I’d rather have these big obvious failures than the insidious little hidden ones a conservative path produces. At least now we know to be skeptical. No development path is perfect; if this one were more conservative, we might get used to taking results at face value, leaving us more vulnerable to that inevitable failure.
Her is set in 2025
“It would be nice to develop an auxiliary sign language to bridge the accessibility gap between the hard of hearing and those who don’t learn a dedicated sign”
“You’re just as bad as the colonizers that decimated native American cultures”
Get out of here with that bad faith savior complex nonsense. Teaching indigenous people English wasn’t the problem, the problem was beating children for using their native language. I guess you think literacy is racist too because literacy requirements were used to disenfranchise black Americans, huh?
Your sanctimonious colonization comments are dripping with irony. I asked a question, directly to another person, about their opinion of the concept as a deaf/hard of hearing person. You interceded uninvited and deliberately ignored the explicitly stated context of the question (gestural languages having unique properties from verbal ones) so you could shoehorn in your opinion about a topic explicitly excluded by that context, which you smugly assumed I wasn’t familiar with, asserting its relevance by referencing authors who wrote very little about the actual topic at hand.
You want to talk about colonizers, look at your own actions here.
My goalposts are in precisely the place they started: a collection of basic international gestures to facilitate the most basic communication. Where are you jumping to colonization? Where did I say that my cultural group gets to decide what the signs are? You’re, again, wildly overestimating the scope of my proposal and jumping to ridiculous, unsubstantiated conclusions.
You get a group of signers from around the world to develop an international pidgin (like they already do informally at international gatherings) and come to consensus based on commonality. When the majority agree on a sign, use it. Where there’s little agreement, choose a new sign. No finger spelling, no complex abstract concepts, just a formalization of gestures most people could probably figure out anyway. I fail to see how that perpetuates colonization unless that’s what you’re setting out to do with your methodology.
I am familiar with the regionality of language. I don’t understand your point: you’re simultaneously saying that you can’t have universal understanding, and that we have gestures we instantly understand so there’s no need to codify them, even though they look different.
I think you’re wildly overestimating the scope of my proposal.
They rest in the boxes they came in.