cadamsdotcom's comments

Abstractions can take away but many add tremendous value.

For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements. Its thermal limits are well known, computed ahead of time, and baked into thermal management so it doesn’t melt yet still runs as fast as we understand to be possible for its size - and we make billions of these every year and have done so for over 50 years.

Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.

Or JavaScript JIT compilation, where the JS engine watches code run and emits faster versions that make assumptions about the types of variables - with escape hatches if the code stops behaving predictably, so you don’t get confusing bugs that only happen if the browser JITted some code. Python has something similar. Thanks to these JIT engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
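
A toy sketch of the assumption-making (the function and the loop count are made up for illustration; real engines like V8 decide all of this internally):

    // A hot function: after many calls with numbers, a JIT can compile a
    // fast numeric version that skips the generic "what type is this?" checks.
    function add(a: any, b: any) {
      return a + b;
    }

    for (let i = 0; i < 100_000; i++) {
      add(i, i + 1); // always numbers, so the optimized version stays valid
    }

    // The moment the assumption breaks, the engine falls back ("deoptimizes")
    // to the generic path - the result is still correct, just slower.
    add("cat", "dog"); // "catdog": string concatenation, not numeric add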

Let’s talk about the decades of research that went into autoregressive transformer models, instruction tuning, and RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with “User: ”, triggering latent capabilities in the model to hold its end of a conversation. Scale that up, call it a “low key research preview”, and you have ChatGPT. Wildly simple idea, massive implications.
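
Roughly, the chat harness is just string formatting around a plain completion model - a sketch with illustrative role labels (real models use their own special tokens and stop sequences):

    // Hypothetical sketch: turning a chat history into one prompt string
    // for an autoregressive completion model.
    type Turn = { role: "user" | "assistant"; text: string };

    function buildPrompt(history: Turn[], nextUserMessage: string): string {
      const lines = history.map(t =>
        (t.role === "user" ? "User: " : "Assistant: ") + t.text
      );
      lines.push("User: " + nextUserMessage);
      // The trailing "Assistant:" cues the model to complete the reply;
      // generation stops when it emits the next "User:" marker.
      lines.push("Assistant:");
      return lines.join("\n");
    }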

These abstractions take you further from the machine and yet despite that they were adopted en masse. You have to account for the ruthless competition out there - each one would’ve been eliminated if they hadn’t proven to be worth something.

You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).

Or to take a moment to marvel.


This is nothing new; business gotta pay for itself after all.

But ads don’t have to ruin a great company.

A century or more ago, top tier journalistic institutions created norms of putting strong barriers between the reporting and advertising sides of the house. That kept trust with customers and made journalism a sustainable long term business.

So, it’s mostly Google that couldn’t keep its hands out of the cookie jar (not solely Google, but they’re an industry leader). It really doesn’t have to go south, it’s not the default, but Google set the tone for Silicon Valley in exactly the way wise journalism leaders set it for their industry in the late 1800s - just in the opposite direction. If OpenAI has a long term view on this they’ll follow a journalism industry model instead of a cookie jar model - but they have to believe deep down that customer trust is worth more than ad dollars long term.

There are reasons to hope: OpenAI has more and fiercer competition than Google, including Chinese competitors that can’t be lobbied away. Qwen, DeepSeek, Mistral and Kimi all have free chat UIs!

I remain stubbornly optimistic.


Jony must have got bored of hanging in North Beach with Sam Altman ¯\_(ツ)_/¯

Huh?

The whole point of society is that you don’t need to know how the whole thing works. You just use it.

How does the water system maintain pressure so water actually comes out when you turn on the tap? That’s entirely the wrong question. You should be asking why you never needed to think about that until now, because that answer is way more mind-expanding and fascinating. Humans invented entire economic systems just so you don’t need to know everything, so you can wash your hands and go back to your work doing your thing in the giant machine. Maybe your job is to make software that tap-water engineers use everyday. Is it a crisis if they don’t understand everything about what you do? Not bloody likely - their heads are full of water engineering knowledge already.

It is not the end of the world to not know everything - it’s actually a miracle of modern society!


These types of failures are par for the course, until the tools get better. I accept having to undo the odd unruly edit as part of the cost of getting the value.

Much smaller issue when you have version control.


> The XNU kernel runs on a variety of platforms

This is fascinating, would love to know where it’s used! (Besides macOS)


I believe it means Apple's other hardware platforms (phones, tablets, smart TVs, VR headsets, smartwatches)

It's used in iOS as well. iOS runs in some unexpected places, like for example Studio Display. Also, the Apple Lightning Digital AV Adapter runs Darwin (because RTKit didn't exist yet).

And touchbars too, strangely enough.

For Intel platforms, the Touch Bar is driven by the trusted coprocessor (T1/T2), but that itself runs bridgeOS which indeed is Darwin/watchOS-based. With Apple Silicon I don't know if bridgeOS is still used; the SEP runs an L4.

That Lightning AV adapter is crazy; IIRC it creates an Ethernet connection and AirPlays the display to the device.

All of Apple's platforms.

Perhaps they mean ISAs

Well, x86 at one point, and ARM in both the 32 and 64 bit versions. I think they had RISC-V support in their source tree at one point, but not really at a commercial level. It does cover a lot of different levels of hardware though.

Does Apple use macOS in servers in its datacentres? Or are they all Linux?

Surely at a minimum they need macOS for CI.

Apple does have one advantage here: they can legally grant themselves permission to run macOS internally on non-Apple hardware, and I don’t believe doing so legally obliges them to extend the same allowance to their customers.

But that might give them a reason to keep x86_64 alive for internal use, since that platform (still) gives you more options for server-class hardware than ARM does


They do run Apple Silicon in data centers, so perhaps another custom version of Darwin + their system frameworks. It is hard to tell without some leaks :)

For Private Cloud Compute: “a new operating system: a hardened subset of the foundations of iOS and macOS tailored to support Large Language Model (LLM) inference workloads while presenting an extremely narrow attack surface.” https://security.apple.com/blog/private-cloud-compute/

I wonder if there is any chance we might see another Xserve?

If they’ve got Apple Silicon servers in their own data centres…


They use Ubuntu on x86-64 servers, at least for iCloud. Backends for iCloud, Photos and Backups etc. are written in Java.

Any sources or more information on that?


For the Java bit at least, this aligns with job descriptions I’ve seen and recruiter outreach I’ve received (long time ago though, maybe 5 years).

NeXT's WebObjects gained a Java variant after the Apple acquisition, and for several years it was the main server-side infrastructure.

Nowadays you can usually still find Java and JVM languages like Clojure (Apple Maps), on Apple's job ads.

How much of it is still Java based, no idea.

I imagine Xcode Cloud has nothing to do with it, for example.


Unfortunately I am the source in this case. It is from having worked on them personally. :)

PPC32/64 of course, and for a long time Darwin still contained remnants of its predecessor's support for SPARC, PA-RISC, and m68k.

Which Apple products run arm32 XNU? Their first Apple Silicon CPUs were already arm64.

Well, there were still the historical arm32 chips in their iOS devices, and until recently the watches used a cursed arm64_32 (or something like that), which is arm64 with 32-bit pointers, IIRC.

I should have just sourced this [1]: they had PowerPC, not RISC-V, in their source tree - that was the X-factor one. The arm32 variant is closed source (it leaked before) but was supported until iOS 11. XNU is really old - almost 30 years! And before XNU there was the Mach kernel and the larger BSD tree it was built on, which is an argument that it probably had an initial MIPS release too, but I couldn't source the truth on that.

[1] https://en.wikipedia.org/wiki/XNU


Is mc68k or PPC still in there anywhere?

I'm sure there are vestiges of them somewhere, but the underlying support (the architecture-specific parts of the Mach portion of the kernel) is gone for those archs.

I wouldn't be surprised if they keep a minimal Power base maintained behind closed doors. It's how they managed to jump ship to Intel so quickly - they never stopped maintaining NeXTSTEP's x86 port.

I seriously doubt it.

Apple's ARM implementation is in a really good place right now. It would take something extremely compelling to get them to consider any other architecture for an application processor, especially considering that it'd mean giving up some degree of control.

Power is probably not where Apple would choose to go unless something really unusual happened. It's essentially just IBM's pet architecture at this point.


I would honestly be shocked if they were.

They've been making quite a few changes to the virtual memory code over the past decade, and keeping those vestigial archs around is a pretty big maintenance burden. It'd probably be less work to just add the arch as if it were new when it's needed, since the kernel itself is pretty portable.


IIRC, Apple uses 'platform' to refer to an SoC integration. For example, M1, M2, etc. are separate platforms. The M5 in the Vision Pro is a separate platform from the M5 in the MacBook Pro. I believe Apple's XNU still somewhat supports non-Apple Silicon as well, though.

Yeah, there was that whole x86 thing they did for quite a while.

Twice, on the basis that NeXT used the same kernel, which ran on 68k and Intel when Apple bought them and was later ported to PowerPC. When Steve Jobs went back to Apple, for a long time he ran NeXTSTEP on a ThinkPad.

NeXTSTEP also ran on SPARC iirc

OpenSTEP actually.

OpenSTEP had SPARC support, yes, but NeXTSTEP's last release had support for m68k, x86, and SPARC. 3.3 had support for PA-RISC.

Hey! Thanks for the thought provoking read.

It’s a limitation LLMs will have for some time. Because the game is multi-turn with long-range consequences, the only way to truly learn and play it is to experience significant amounts of it. Embody an adversarial lawyer, or a software engineer trying to get projects through a giant org..

My suspicion is agents can’t play as equals until they start to act as full participants - very sci fi indeed..

Putting non-humans into the game can’t help but change it in new ways - people already decry slop and that’s only humans acting in subordination to agents. Full agents - with all the uncertainty about intentions - will turn skepticism up to 11.

“Who’s playing at what” is and always was a social phenomenon, much larger than any multi-turn interaction, so adding non-human agents looks like today’s game, just intensified. There are ever-evolving ways to prove your intentions and human-ness, and that will remain true. Those who don’t keep up will continue to risk getting tricked - for example by scammers using deepfakes. But the evolution will speed up, and the protocols for proving trustworthiness will get more complex..

Except in cultures where getting wasted is part of doing business. AI will have it tough there :)


Agreed, unfortunately.

Passwords are easy to understand, transparent and portable, and when used with good hygiene (always using a password manager and generating unique, strong passwords for everything) there isn’t yet a strong case for anything else.


I’m not happy with everything about passkeys either. I am fine with them as an additional method, but I would never use them as the only method.

That said, I had a much easier time getting my kids onboard with a FIDO2 security key than I would have a password manager.

"Enter your email and touch this" is easy to understand.


One thing it enables is being the sole credential - as in, not needing usernames. It makes for faster logins and isn't prone to security issues caused by browser autofill.

This also means sites can allow you to sign up without collecting any more info than registering a passkey, but of course they want to siphon all that data.
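
A rough sketch of what usernameless sign-in looks like in the browser with WebAuthn discoverable credentials (challenge generation and server-side verification omitted; the option values are illustrative):

    // With no allowCredentials list, the authenticator offers whatever
    // discoverable passkeys it holds for this site - no username needed.
    async function signInWithPasskey(challenge: Uint8Array) {
      const assertion = await navigator.credentials.get({
        publicKey: {
          challenge,                     // random bytes issued by the server
          userVerification: "preferred",
          // allowCredentials omitted: let the user pick a passkey
        },
      });
      // Send `assertion` to the server, which looks up the account by the
      // credential ID and verifies the signature with the stored public key.
      return assertion;
    }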


Citation needed.. really sorry to say it because there are plenty of things to say about the current US administration.

It feels like people invented this story, farming for followers on socials by manufacturing outrage. A close read of the article will reveal that the networks denied it.

This needs a deeper dig before opinions are formed - especially given the broadcasters' vehement denials of manipulation.

But until then, citation needed.


> The underspecification forces the model to guess the data model, edge cases, error behavior, security posture, performance tradeoffs in your program

It’s guessing using the sum total of its ingested knowledge, and it’s reasonable to assume frontier labs are investing heavily in creating and purchasing synthetic and higher quality data.

Judgment is a byproduct of having seen a great many things, only some of which work, and being able to apply that in context.

For most purposes (granted not all) it won’t matter which of the many possible programs you get - as long as it’s usable and does the task, it’ll be fine.

