
"Of course, Asahi Lina and I are two individuals with minimal funding. It’s a little awkward that we beat the big corporation…"

Love the euphemism. This puts Apple to shame, plain and simple. They obviously don't care about standards or compliance, because they like people to be walled into their own little private garden (still waiting for the FaceTime standard, or any kind of cross-platform technology created in the past 10 years).

If I weren't an iOS dev, I would have run away from the Apple ecosystem a long time ago. I love their hardware, and loved the brand back in the 80s and 90s when Apple was about creativity, putting humans before machines, etc. But what this company has become is just a corrupted mess of greed behind a curtain of politically correct marketing videos.



I agree with you completely, and yet it's practically impossible to find a worthwhile alternative. I tried finding alternatives on my last upgrade cycle, and it's like having to live with endless compromises just to get away from Apple.

For the iPhone, I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time. Not to mention the polish of the software, app ecosystem and stability.

Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

I also like to read comics and magazines on the iPad, and that's a market where I have no idea what an alternative would be. And that's been the state of the tablet market for years now.

Maybe I could get rid of my Apple Watch, but it just works and has endless amounts of third party accessories. I've been looking at Fossil Hybrid Smartwatches and it seems like they are a hot mess of instability and bad support.

At the end of the day, it's about stability and ease of use. I'm way past the point where I had the time and found it really cool to try every new ROM coming out ("daily driver", "What isn't working? You tell me"), and it seems like Apple still can't be beaten on this front. Sadly.


Probably the closest option to the M1 on build quality and battery will be the AMD version of the Framework laptop... if you want, it can run Windows or Linux without issue. I'll probably go that direction on my next purchase. Later this year and next year AMD is releasing new laptop CPUs/APUs that will absolutely kill on performance-per-watt; they're a bit ahead of Intel at the moment, and they're going to leap a bit further ahead if GPU perf matters to you. Not that Intel is asleep, just behind.

On the phone, I started with Android, so just kind of used to it... I've bought about every 2-3 generations for a while, currently on a Pixel 4a. I tend to avoid the high end, and find if you wait 3 months or so after a new release, the kinks are usually worked out by then.

As to the iPad, there really isn't a good alternative that I'm aware of... there are still a few Android options, but none are great... the MS Surface tablet and other convertible laptops are okay, but still not as nice a UX. It's not my thing so it doesn't bother me, but I can understand why, if it works, it really works for you.

Watch is about on par, from what I understand... again, not something I'm into personally.

I tend to take the Apple option for work (software dev, mostly web/svc oriented) only because corp Apple experience is generally better than corp windows.


I vote with my wallet.

The Framework laptop is my choice. It is not the best in every category, but it is mine.

I didn't like the weak hinges, so I replaced them. I can replace the battery when I need to. I can upgrade the memory, the hard disk, the motherboard, or the screen when I need to.

The thing with Apple is that their vertical integration results in highly polished, non-standard, unserviceable machines. They are nice, but not worth the trade in ownership for me.


I feel exactly the same about Apple's software. It's great as long as your preferred workflow matches the Apple-blessed workflow. But if it doesn't, you're pretty much hosed.

Desktop Linux is not as nice or facile or polished, but it feels like "my" desktop because I can modify it. When I use macOS, it feels like I'm just renting somebody else's computer. It's a very nice computer, but it can never be mine.


I mean, that’s your definition of “yours” though.

It’s like saying that buying a condo isn’t home ownership due to an HOA having some oversight. Some people are totally fine with that, it doesn’t mean it’s less “theirs”.


I would in fact argue that exact point.

How much you control something directly relates to how much you own it mentally and practically.


I regret to inform you that your point would not be a winning argument with a significant portion of the population.


In terms of "premium finish" feeling, I think that the Starbook from Star Labs might be in the convo - they actually make their own chassis, unlike so many other vendors.

https://us.starlabs.systems/pages/starbook

(I know Framework isn't rebranding stuff - just throwing another in the mix)

Good luck getting one though, wait times seem bad every time I look.


My Starbook arrived about a year after I purchased it, and was DOA.


>As to the iPad, there really isn't a good alternative that I'm aware of...

As someone who refuses to use Samsung phones because of the bloatware (and was using a Pixel 4a until the sim slot crapped out last week, of all things) I've actually been pretty happy with Samsung's tablets.


I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone. I would never use an iPhone for certain software and UX related issues (biggest is the missing back gesture/button). My Pixel takes better photos than ANY iPhone on the market.

Macbook Air M1, just look at Dell XPS series.

Samsung S9 tablets have an OLED display. OLED... you don't get that on any iPad.

Apple Watch really has no competitor which matches 100%. On certain areas like fitness (Fitbit) or hiking (Garmin) there can be also some good and better alternatives but it does not match 100% of the features.

I think there is always a choice except when looking at the Apple Watch.


I had a top of the line XPS. Battery life is less than half that of the Air. The speakers in the XPS sound like they are from the 90s. It gets painfully hot on the bottom case. The keyboard wrist rest area is cheap plastic. It was thicker and heavier than the MBA. Intel Iris GPU (ie slow af) versus M1/M2 GPU.

One point to the XPS: the pixel density of the 4k 13" was absolutely LOVELY. I have never seen a screen so nice.

There is really no comparison overall, though: the Apple laptops blow them (and everything else in that category) out of the water.

The top end iPhones are similarly 2-3 years ahead of the flagship Pixel devices in build quality, too. I tried, really I did.


Same here. The XPS’s hard drive died within two years and I had to replace it.

The keyboard and touchpad also don’t hold a candle to MacBooks.


I bought a MacBook Pro in 2013 and my mom bought a Dell XPS at around the same time. Her laptop died, she got a Lenovo Yoga. That one died within a year and was replaced for free. The new one died after two years. When the M1 Air came out I gave her the old MacBook Pro, which she still uses every day, going strong (aside from battery life, but it's mostly plugged in).

It has essentially outlived 4 Windows laptops. I expect the M1 Air to still be relevant in a decade, as well.


I bought a mid-range Acer laptop in December 2012. I used that thing as my daily driver until 2018 until I finally got a desktop. In 2020 I repurposed that laptop as a server and it's been running ever since. For tens of thousands of hours.

My point is that Apple doesn't have a monopoly on quality. Perhaps I'm pretty lucky in this sense, but I have had great longevity out of all of my hardware, and I have owned very few Apple devices.


The M1 won't; the SSD will wear out before that, and with no way to swap it, the machine will be landfill as well.


In 2013, TechReport ran an SSD endurance test: for months, it performed non-stop write operations, a much more strenuous use case than what a regular laptop experiences, which is primarily read operations. Four months in, after writing 300TB, all the tested drives were still working without issue.

If the characteristics of the SSD in an M1 are sufficiently similar to the SSDs that were used back then (I have no clue if that's the case), wear out will be a non-issue.
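As a rough sanity check, that endurance figure dwarfs what a laptop actually writes. The 300TB is from the test above; the 20GB/day host-write rate is an assumed typical laptop workload, not a measurement:

```python
# Rough SSD lifetime estimate from a total-write endurance figure.
# ENDURANCE_TB comes from the TechReport test above; DAILY_WRITES_GB
# is an assumed typical laptop workload (heavy users write far more).
ENDURANCE_TB = 300
DAILY_WRITES_GB = 20

days_to_exhaust = ENDURANCE_TB * 1000 / DAILY_WRITES_GB
print(f"~{days_to_exhaust / 365:.0f} years at {DAILY_WRITES_GB} GB/day")  # ~41 years
```

Even if you triple the assumed write rate, the drive outlives the rest of the laptop.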


Since 2013 there have been multiple iterations of SSD technology that increased storage density but sacrificed storage durability: https://www.howtogeek.com/444787/multi-layer-ssds-what-are-s...

This is why people are concerned about SSD endurance. An old SLC SSD's NAND would be good for many, many writes, though the controller would often fail. Nowadays the NAND fails, but the controller is fine.


Do the M1s have especially short write lifetimes? I have a Toshiba from 2010 or so and the SSD is still fine even if the trackpad and speaker ports died years ago.


After 2 years of use, my M1 MBP has 5% write life used. Extrapolate that out and you get 40 years of lifespan.
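That extrapolation is just linear scaling, assuming the write rate stays constant over the machine's life:

```python
# Linear extrapolation of SSD wear: if `percent_used` of the write life
# is consumed after `years_used`, projected total lifespan follows from
# assuming a constant write rate.
def projected_lifespan(years_used: float, percent_used: float) -> float:
    return years_used / (percent_used / 100)

print(projected_lifespan(2, 5))  # 2 years at 5% used -> 40.0 years
```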


I'd note that the 2013 MBP also has an SSD that hasn't experienced any problems so far.


Been running an XPS 9560 since 2017 and it's rock solid. Windows 10 with WSLv2.

Basically, the key to any Windows machine is finding the combination of drivers that is stable. That is the trade-off with Windows: they support a ton of hardware, so naturally the driver quality varies. Apple's problem space is easy in comparison: one set of hardware, one set of drivers. A lot of the complaints about XPS hardware really boil down to the bad set of drivers Dell ships them with.

As I get older, it has become pretty clear the sweet spot is to be two generations behind the latest and greatest. You get the ideal balance of price, cheap replacement parts and stability.

So not turn key, but also not a rip off prison like Apple :)


Isn't a good high density screen typically a large power drain? Perhaps there's a trade-off that was made there…


It's not included because it's really hard to make a high quality screen, not because of the battery tradeoff.


> Macbook Air M1, just look at Dell XPS series.

If you want an unusable trackpad, a middling keyboard, a fan that spins up and down at random when the laptop is just sitting there, a space heater for your backpack when you close the laptop and stow it away, and a pathetic battery life, then, yes, a Dell laptop is just what you need.

I'll admit, I have a more expensive Dell Precision laptop, so maybe the XPS is actually usable, but I'm not going to hold my breath. The one that I have is the worst POS laptop I've ever had the pleasure of being forced to use.


I had a personal XPS 13 back in 2015 or 2016 that I loved. Right now for work I have a Precision 5560 from 2021 which has all the drawbacks you listed and that I hate. I don't know if it's a year thing or a model thing, but certainly there is no brand consistency when it comes to Dell.


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

I don't think most people -- even iPhone users -- hold that opinion. In the US, at least, the iPhone is still a status symbol. People have iPhones because they don't want their iMessage bubbles on others' phones to be the wrong color. They're locked into that ecosystem with various purchases and don't want to throw that away. They use a Mac and like the integration.

On top of that, I (as an Android user) am constantly uncomfortable running a mobile OS built by a company that exists mainly to track people's behavior and invade their privacy, with the goal of selling ads (and I am more vehemently anti-advertising than most people). As much as I don't fully buy "Apple's commitment to privacy", they are in a much better place in that regard than Android is. I lock my phone down and give nearly every app (including Google's) zero permissions, and only enable (and then immediately disable[0]) as necessary, but I'm still convinced my privacy posture would probably be better with an iPhone. But I don't want to live in that walled-garden nanny-state, so that's that.

[0] https://play.google.com/store/apps/details?id=com.samruston....


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

Because Google... I could live with not having iMessage or AirPlay, that's annoying but something I could live with. So it's either a de-googled Android phone or iPhone, and I do need a few apps which are only available in the App Store or Play Store, so I figure I'm limited to phones that can run something like CalyxOS, which basically limits me to Pixel or FairPhone.

The FairPhone isn't a terrible choice, but I'm not going to replace a functional iPhone with it... if it breaks, maybe, or I could get a used iPhone.

It's not that I trust Apple all that much, I just trust them way more than Google at this point. I don't think Google is evil or bad, but their interests and mine don't really align.


>I could live with not having iMessage

So, literally any messaging app, something that non-Apple users have to do anyways, and have to deal with your bullshit about only going through iMessage when they have to send you SMS.

>AirPlay

Chromecast is infinitely more ubiquitous. Also, if Apple didn't patent AirPlay and refuse to share it with anyone, you wouldn't be in this situation.

I will absolutely agree with Google being an absolutely dreadful steward of Android, but make no mistake: you gave Apple full support in locking themselves down in their own little playground, and now you're complaining you can't get out.


> Chromecast is infinitely more ubiquitous.

Chromecast is a device.

> Also, if Apple didn't patent AirPlay and refuse to share it with anyone, you wouldn't be in this situation.

Google Cast is just as proprietary as AirPlay. Both require licensing to be included in devices. I have an LG TV that supports both, an ancient Roku device that does the same, as well as supporting Miracast. I suspect you're confusing Chromecast and the Google Cast protocol with Miracast, an open standard; one dropped by Google in favour of their proprietary stack.


> Chromecast is a device

It's branded as "Chromecast built-in" when supported by a TV, not "Google Cast".

e.g. https://www.sony.com/image/89821bf64399cd4c34680e0988903e4b?...


Is that a recent rebranding? The SDK is still called Google Cast.

https://developers.google.com/cast.


It's probably a difference between developer facing and consumer facing, or software vs hardware branding.

Or maybe they just couldn't get a trademark for "Cast"?


My son bought a Dell XPS for exactly that reason. After 1.5 years, the battery life was at 50%. He called Dell support and they said it was normal. He's now in the market for a MacBook Air M2.


I can’t believe my M1 Mac is almost 3 years old, performs just like the day I bought it (blazingly fast).

My work Thinkpad from the same period feels half way dead.


That's likely more a reflection of the software you run (and update over time) than of the hardware itself, no?


It helps that the Apple Silicon CPUs run so cool. In other laptops your CPU cooler and fans will fill up with dust causing the CPU to thermal throttle.


This is a limit of battery technology. Your Apple laptop will have shit battery life in a few years as well.


I have an Apple laptop that is a few years old. It does not have shit battery life. Probably 90% of new.

Did you research this statement before making it?


I've owned several Apple laptops over the last 20 years for both personal and work use and they've all had reduced battery life over time. I've suffered one recall, another one which failed 1 month past the warranty and which the Apple Store said was quite normal(!), and several had degraded to the point I had to ensure they didn't drop down to below 20% (or some other magic number). After the first few I took into the Apple Store only for them to say "oh that's to be expected" I stopped going in. Not sure why people think Apple laptops are magically immune, they're not. They suffer battery issues just the same as other brands do.


Apple introduced a battery charge limiting feature on the MacBook in 2019 (or a few years ago?), while Windows laptops supported it in the 2000s. That would explain the previous experience. https://support.apple.com/en-us/HT211094


Yep. Apple battery health in system preferences will show you how much the battery has degraded since the device was new. My ~2yo M1 MacBook Pro still has 92% capacity compared to when it was brand new.


I don't know if it's in System Preferences, but for me, System Information shows my mid-2015 MBP (running macOS 10.14.6 Mojave) still has a battery capacity of 8266mAh and a remaining charge of 8079mAh at 100%. Compared to the advertised capacity at launch of 8755mAh, that's a charge capacity of ~92% after ~five years (AppleCare replaced my battery during their recall). MacBooks are just built different.
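The arithmetic behind that ~92% figure, using the numbers above (charge held at 100% today versus the advertised design capacity at launch):

```python
# Battery health as computed above: the charge the battery holds at a
# full charge today, divided by the advertised design capacity at launch.
DESIGN_CAPACITY_MAH = 8755   # advertised capacity at launch
CHARGE_AT_FULL_MAH = 8079    # what System Information reports at 100%

health = CHARGE_AT_FULL_MAH / DESIGN_CAPACITY_MAH
print(f"{health:.0%}")  # 92%
```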


Modern Macbooks are very strategic about when and how much they charge. You can override it if you know you need a full charge Right Now, but otherwise they will decide how much to charge based on your usage patterns, which keeps the battery alive much longer than on many other brands of laptop.


Apple's batteries are covered under warranty up to 1000 cycles. I had them replace my battery that hit 80% capacity at 3 years right before AppleCare ran out.


Device manufacturers can engineer for longer useful lifespan by oversizing the battery, can't they? Do they all do that to the same extent?


To an extent; there's a limit for air travel, generally speaking, as big batteries can be dangerous. The M1/M2 are just killer in terms of lifetime and usage for general reading/browsing/email... and still very long even for content viewing. Most people aren't rapidly draining their batteries, so the longevity gets to be a bit better overall.

AMD is getting pretty close and the perf:watt on the coming generation(s) for laptops (including integrated gpu) look to be really impressive to say the least...


They can also implement better battery management technology (cooling, charge rate curves, keeping the charge between 10%-90% instead of 0-100%, but reporting 0-100% to the user via scaling), etc, etc.
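The "scaled reporting" trick mentioned above can be sketched like this; the 10%-90% window is a hypothetical example, not any vendor's actual firmware values:

```python
# Hypothetical "scaled reporting": firmware keeps the real state of
# charge inside a safe window (here 10%-90%) but shows the user a
# 0-100% figure linearly rescaled across that window.
SAFE_MIN, SAFE_MAX = 10, 90  # assumed window for illustration

def displayed_percent(true_percent: float) -> int:
    scaled = (true_percent - SAFE_MIN) / (SAFE_MAX - SAFE_MIN) * 100
    return max(0, min(100, round(scaled)))  # clamp to 0-100

print(displayed_percent(90))  # firmware stops charging here; user sees 100
print(displayed_percent(50))  # user sees 50
print(displayed_percent(10))  # firmware cuts off here; user sees 0
```

Keeping the cells out of the extreme top and bottom of their range is what slows chemical degradation; the rescaling just hides that from the user.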

For a good counter-example, look at the early Nissan Leafs. They burned out their batteries in a matter of a few years, but battery replacements for other brands from that time are basically unheard of. (The inherent information asymmetry for new car purchasers is one reason Biden's IRA dictated minimum car battery warranties.)


Funny, I heard the total opposite about Nissan Leafs. The industry was guesstimating that batteries would last 8-10 years. The first Nissan Leafs (about the first commercially mass-available EV) had battery lives where something like 90% were still going strong, with 80% of original capacity left, after 13 years.

Rather than the Leaf being problematic, it was the car that showed the market that worrying about the lifespan of EV batteries wasn't really necessary.


I had one of those early Leafs, the battery degradation was real. Newer vehicles seem to have much better curves.


The key to battery life/health on the XPS is to use the BIOS functions to limit charging. My XPS-13 9370 has been plugged in most of its life (about 4 years now) and battery health has dropped from 96% to 93%.

I can't speak to the rest of the comparison to the Macs - they're probably better overall - but the battery life is a solved problem if you know to limit charging.


That's clever, but probably too clever for the 99% of people who don't even know what a BIOS is. You've got to wonder why Dell wouldn't do that kind of front end work themselves.


When most people you know use iMessage then Android is a bad experience.

Also, as someone who switched from Pixels and other Android devices to the Apple ecosystem: it's nice that everything "just works."

It’s kind of like running BSD or Debian stable after having been on Fedora/Arch/etc.


> When most people you know use iMessage then Android is a bad experience.

Isn't this because Android uses open standards for its SMS and iOS refuses to do so?


RCS as a baseline standard is proprietary, but Google then slapped a bunch of proprietary extensions onto it that it refuses to license, so no, it's not.

https://arstechnica.com/gadgets/2022/08/new-google-site-begs...

> Google's version of RCS—the one promoted on the website with Google-exclusive features like optional encryption—is definitely proprietary, by the way. If this is supposed to be a standard, there's no way for a third-party to use Google's RCS APIs right now. Some messaging apps, like Beeper, have asked Google about integrating RCS and were told there's no public RCS API and no plans to build one. Google has an RCS API already, but only Samsung is allowed to use it because Samsung signed some kind of partnership deal.

> If you want to implement RCS, you'll need to run the messages through some kind of service, and who provides that server? It will probably be Google. Google bought Jibe, the leading RCS server provider, in 2015. Today it has a whole sales pitch about how Google Jibe can "help carriers quickly scale RCS services, iterate in short cycles, and benefit from improvements immediately." So the pitch for Apple to adopt RCS isn't just this public-good nonsense about making texts with Android users better; it's also about running Apple's messages through Google servers. Google profits in both server fees and data acquisition.

Like c'mon, Google doesn't care about open standards except insofar as they allow it to embrace-extend-extinguish. Google's end goal is iMessage, but with Google servers in the middle instead of Apple ones.


> there's no way for a third-party to use Google's RCS APIs right now

To be fair, this criticism is fundamentally true of iMessage, too. Implementing all of iMessage's features in an open, trustless manner is impossible.


Thanks, glad I asked - I genuinely was not sure.


Sorry, it sounded snarky/pointed, that was more adversarial than was ideal. It really is hard to ask a question these days, if it's political...


This is correct. Google put a lot of effort into making carriers adopt RCS (which has most of the functionality of iMessage), but Apple will not adopt it to keep their "competitive advantage."

https://www.android.com/get-the-message/

I use an iPhone now but these kinds of business tactics and the others mentioned here really make me wish there were more competitive products on the other end.


RCS is pretty garbage without Google’s extensions. Of course this ends up being very similar to iMessage.


Does it matter for me as a user?


No, and I wouldn't want to imply otherwise. I was genuinely asking because my recollection is that the answer is "yes" but I don't recall.


I loved Android back when I had the time to hack around with ROMs and crazy customization. It's fun. But these days in my busy working life, I don't have time for that kind of stuff, I just want something that works well and gets regular security updates. My Android phones weren't really cutting it.


> I would never use an iPhone for certain software and UX related issues (biggest is the missing back gesture/button).

During my time as an Android user, that back button struck me as the single biggest anti-feature in the Android UX. Every application implemented it differently, sometimes I'd find it bumped me right out of an app if I tapped it once too often, others didn't do that. I hated the thing; it required learning a different set of mysterious tendencies for every app and situation. So happy that it's not stinking up the screen of my first-ever iPhone.


Stuff like the 911 calling bug is a great example: https://news.ycombinator.com/item?id=32713375


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

Because some people, like yours truly, enjoy having a patched, up-to-date mobile OS, but also don't need to change their phone every other year. My iPhone 7, which I bought refurbished in February 2017, still works perfectly after a battery change. It has the previous iOS version, but it keeps receiving security updates. All the apps I need work on it (games may be too much for it, but luckily I have a PC with a big-ass GPU for that). My dad's Galaxy S7 hasn't had an update in a while. He tried to install 1Password, a freaking password manager which is basically a glorified notepad, and it says it doesn't support the phone and/or the Android version. His GS7 is working fine otherwise, though.

> Macbook Air M1, just look at Dell XPS series.

This has to be a joke. I can wholly understand people not valuing build quality and preferring to save money over that or invest it someplace else. But that doesn't make it "comparable".

Have they finally fixed the touchpad moving by itself or ignoring your finger? The fan spinning like a jet engine for no reason? I hear nowadays everybody's on the "modern standby" bandwagon. How do you like your battery draining 50% while on your commute home while the PC supposedly sleeps? Or waiting around for it to wake up from hibernation? That's if you're lucky enough it doesn't burn down your bag because it figured it's as good a time as any to wake up and do who knows what, which absolutely couldn't wait.

I won't comment on the ipad nor the watch, since I've never owned any of those.


FWIW, Apple does release _some_ patches for previous iOS versions, but there absolutely are mitigations that are not back ported to older versions.

If you care deeply about security, I’d recommend having a phone that actually is using up-to-date OS.


I do care, that's why I'm thinking of getting a new one. But still, iOS 16, the first version which didn't support the iPhone 7, came out in September 2022. The iPhone 7 came out in September 2016. I'm not aware of any Android phone with such a long support cycle.


There is not even one that would have half that.


My android phone is from 2011. All the apps I want to run on it run on it. I know of some that won't, but I don't want to use them.

So what does this prove? Anything? Probably not.


I'd argue that games are a specific kind of app which require performance. It's likely that a brand new low-end phone supports the latest Android OS but not the latest power-hungry games. Though you can probably install them on it, just like I can install power-hungry games on my old iPhone; it's just that playing them won't be an enjoyable experience.

A password manager doesn't have any such requirements.


Agreed Android is a suitable iOS replacement. Maybe a bit wonky at times but I’ve used both and they’re the same for my needs.

XPS is not comparable to M1. Not even close.


> I would never use an iPhone for certain software and UX related issues (biggest is the missing back gesture/button)

Um, swiping from the left edge takes you back on iPhone and iPad. Not sure how long ago you last tested the "missing back gesture" theory. I've been using an iPhone for 3 years and it has always been this way.


Not in several google apps — they deliberately break the convention.


Just Google ruining the experience as usual. Same with YouTube on iOS; they get away with it, so why not try breaking the rules everywhere?


Android phones come loaded with ad tech, malware, depending on a matrix of carrier _and_ manufacturer. Even if they don't start with it, if they actually get updates, any update could bring it along later. It's a metaphoric minefield, where you have to do significant research before finding the current mine-free path. Samsung in particular has a reputation for lacking scruples.

It has nothing to do with hardware.


”…missing back gesture/button”

Swipe from the left of the screen goes back a page in the current app. Swiping right on the bar at the bottom goes back to the previous app used. Two different locations but the same gesture. What else is missing for a back gesture/button?


Back gesture on iOS only works within an app. Back button on Android works system-wide (cross-app).


Swiping along the bottom of the screen goes to the last used app.


It works but I prefer Android's intent based transition system.


What does intent based transition system mean? The advantage of Apple’s system is that it is unambiguous. I’ve heard from plenty of people that the “back” button or gesture surprises them from time to time.


Macbook Air M1 > Dell XPS or whatever alternative

Can't speak to other tablets, but iPads get the job done. I have one of the oldest-generation iPad Airs that still gets updates. Got it secondhand. It does the basics well enough. I'm not sure an Android tablet would even be getting that support.


If you care about privacy, then the only choice within the Android sphere is GrapheneOS, which limits you to Pixels, which arguably are not on the same level in hardware as iPhones.


>I would never use an iPhone... missing back gesture/button

One finger swipe right.


I have a Pixel 3a, and the longevity of a phone is important. But Android is cutting its support cycles down to 3 years. Unless something changes, in fall next year I will get an iPhone 14/15 (possibly refurbished) and keep it for 6-7 years. iOS 16 is still possible on the iPhone 8 - a solid 6+ years of official updates!

Walled garden or not, their products are solid and supported. That is what most consumers look at. No hassle ownership for most.


> I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time.

I have the opposite experience. A few months ago a friend of mine had to buy a new iPhone because their old phone couldn't hold a charge. Shortly after they received the new phone, towards the evening, they remarked how happy they were that their new iPhone (literally a few days old) still had 36% charge after most of a day's usage. I looked at my 2.5-year-old (at the time) Pixel 4a and I still had 84% charge.

> having to live with endless amounts of compromises just to get away from Apple.

Having basic system functionality, such as GPU accelerated OpenGL, Vulkan, or OpenGL ES, seems like a catastrophic compromise. Like I can compromise about how the widgets of application foo and the widgets of application bar don't match each other, I couldn't care less. But no Vulkan support? Forgetaboutit.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

I have a Surface Laptop 4 with an AMD processor and 16GB of RAM. I'm extremely happy with it, though I'll concede that the build quality is a click lower than a M1 MacBook Air (two of the rubber feet have fallen off of mine, being my main ding against it) and nobody can touch Apple Silicon's battery life. But aside from that, it's definitely in the same arena for build quality as an M1 MacBook Air (my wife's daily driver, so I'm not just saying that out of ignorance) with, IMHO, a better keyboard and a touch screen if you're into that sort of thing. Oh, and the facial recognition unlock is the best. I've had zero issues with stability—Microsoft pays more attention to squashing Windows bugs on their own hardware, it seems like.

Surface Laptop 5 has been out for about a year, but it was a totally microscopic incremental refresh. That means the Laptop 4's have dropped in price a lot, even though it's still 95% the same laptop.

My exact model of Laptop 4 can be had on periodic sale, refurbished on Woot.com for ~$700 or less, which is a screaming good deal IMHO for what you get. You could almost get two of these things for the price of one new M1 MacBook Air once you factor in the (IMHO mandatory) RAM upgrade.


A refurbished MacBook Air M1 with 8GB of RAM (which works wonderfully for day-to-day stuff, even development with VS Code) is $849 directly from Apple. I never felt any slowdowns. The thing is humming along nicely and I don't feel a need for an alternative any more. Apple Silicon has turned the industry upside down IMHO.


Does anyone have experience how IntelliJ runs on this machine?


Yes, none of the JetBrains IDEs seem to be usable on an 8GB Mac. They invariably start throwing errors about running out of memory after a while, and then everything starts to lag (including typing).

Also, you can't really run any other process alongside it that uses more than ~500-1000 MB (so no Node, for instance). Having a browser with more than a few tabs open alongside is also an issue.

Maybe a 5+ year old version would work better.
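For what it's worth, the out-of-memory behaviour is partly tunable: JetBrains IDEs read extra JVM flags from a per-user .vmoptions file (also reachable via Help > Edit Custom VM Options). A minimal sketch, assuming a macOS install of a hypothetical IntelliJ 2023.1 (adjust the path to your actual version directory):

```shell
# Cap the IDE heap so it can't claim most of an 8 GB machine; the
# trade-off is slower indexing instead of OOM errors mid-typing.
# IDEA_VMOPTS lets you point at a non-default config dir.
VMOPTS="${IDEA_VMOPTS:-$HOME/Library/Application Support/JetBrains/IntelliJIdea2023.1/idea.vmoptions}"
mkdir -p "$(dirname "$VMOPTS")"
printf '%s\n' '-Xmx1536m' >> "$VMOPTS"
grep Xmx "$VMOPTS"    # confirm the flag landed
```

Whether 1.5 GB is enough depends on the project, but it at least makes the memory budget explicit instead of letting the IDE pick its own ceiling.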


Thank you very much.


I have an AMD 5700U Linux notebook for work. I'll second that they're really quite great on battery and speed.


> Oh, and the facial recognition unlock is the best.

This is scary as fuck, especially combined with all the telemetry and tracking M$ now does, without a care, from within its operating system.

Windows and Android are subsidized data collection applications which run on subsidized hardware. Simple as.


Across every consumer-facing device where it has ever been deployed—which is probably in the billions at this point thanks to iPhone alone—consumer-grade biometric face unlock technology has a body count of precisely zero.

If I turn out to be the world's first facial recognition casualty, that would be SUCH a hilariously unlikely way to go that I almost couldn't even be mad about it.


> Windows [...] are subsidized data collection applications which run on subsidized hardware. Simple as.

How does Microsoft monetize that (or even how could they)? Are you just saying random things?

Also, you could just install Linux; of course, battery life would likely be even worse then.


LOL, how would you unlock an iPhone?


There's nothing valuable about your Windows telemetry. It just means if you hit a crash it's possible it'll get fixed.


Apple has enjoyed, and seems set to enjoy for a while longer, a full process-node advantage over everyone.

People whiff that (cough RDNA3), and people overcome it kinda (Raptor Lake), but ceteris paribus, their shit is just from the future.

Having a bone to pick with Apple is a very reasonable thing; I've got a few gripes myself. But that's why you buy something 2-3 years behind the cutting edge - not because it's better.


Based on reading Taleb, I have a theory and would love to hear counterarguments if there are any:

Products are only as good as their worst part. Apple's complete integration allows them to more easily fix the worst part, since they have control over almost everything. Historically, "open" systems, as Bill Gates would call Windows together with suppliers such as Intel and Dell, could get away with some bad aspects as long as they just threw in new CPUs with smaller transistors. Now that Moore's law and Dennard scaling have slowed down, fixing the worst part is the only way to find good improvements.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

The ThinkPads are pretty good laptops, especially if you put Linux on them.


Used to be a 100% Linux person. I've gone through many ThinkPads. The M1 Air is so far beyond them it's not even close.

Battery life is (not exaggerating) 5x or more in my experience.

It's, honest to god, much faster.

Completely silent and cool. (One of my ThinkPads got so hot it almost burned me, and the fans get so loud.)

I've never had a single crash, random restart, failure to sleep, failure to charge, driver problem, touchpad randomly not working, wifi failing, all of which I've had with thinkpads.


A 5x battery life improvement seems like the baseline had some issues, unless your MacBook goes like a week without charging.


My M2 MBA routinely goes past 10-16 hours of battery life with simple coding stuff (PhpStorm and PHP in Docker), and it stays comfortably cool the whole time. Most Windows laptops struggle to get more than 3-4 hours and will fry off your balls if used as actual laptops.


The poster was apparently a 100% Linux person previously, though. I'm pretty sure Windows was designed to heat up like that as a funny joke. For example, on Reddit I see somebody complain that they only get 5 hours with my laptop model (Zenbook Flip 13):

https://www.reddit.com/r/ASUS/comments/ry0nwb/the_asus_flip_...

But it is pretty easy to get 14 hours in Linux, I think, at least if you believe the battery indicator.


To be fair, Windows has better battery optimisations for new laptops. Many hardware-supported sleep modes were still missing from the Linux kernel when I last checked, for example.

Huge differences probably come from lack of user skill more than from the OS itself. Or just broken drivers.
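For anyone curious which sleep modes their laptop actually exposes to Linux, the kernel reports it via sysfs; a quick read-only check (no special tools assumed):

```shell
# /sys/power/mem_sleep lists the suspend variants the firmware offers;
# the bracketed entry is the active one. "deep" (S3) usually saves far
# more power than "s2idle", but many modern laptops only ship s2idle.
if [ -r /sys/power/mem_sleep ]; then
    cat /sys/power/mem_sleep      # e.g. "s2idle [deep]"
else
    echo "mem_sleep interface not available"
fi
```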


Isn't battery life generally much worse on Linux compared to Windows?


It is pretty configurable. I don’t see why it should be worse (assuming of course you don’t have driver problems).

And the configuration is pretty helpful, for example I have an OLED screen, so I can get some power savings from making things mostly black.

Plus the hard drive can mostly be idle; you don’t have Cortana or whatever they call it now poking around for interesting bits.
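For before/after comparisons when tweaking, battery state is also readable straight from sysfs (the BAT0/BAT1 naming is the usual convention, not a guarantee):

```shell
# Print charge status and percentage for each battery the kernel knows
# about; with no battery present the glob won't match and we say so.
for bat in /sys/class/power_supply/BAT*; do
    [ -e "$bat" ] || { echo "no battery found"; continue; }
    echo "$bat: $(cat "$bat/status" 2>/dev/null) at $(cat "$bat/capacity" 2>/dev/null)%"
done
```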


> I’m pretty sure Windows was designed to heat up like that as a funny joke

Nah, it's common across all x86 devices. Even Apple's old lineup... which is why they went for M in the first place, Intel couldn't be arsed to deliver something power efficient.


I guess I find this troubling because it would seem to indicate that I spend multiple hours a day typing at a dead laptop, hallucinating that it is still working.


My most recent ThinkPad was an X1 Carbon with Kubuntu; running IntelliJ / browsers / Docker, it would last around 3 hours or so. The M1 Air is 15+.

I also have an Anker 737 battery pack; with it fully charged, I can double the MacBook's battery. It would only partially charge the ThinkPad, so it wouldn't even double that.


That’s weird, I wonder which program did it.

I typically get a day of work out of my Zenbook flip 13; I haven’t really measured the battery performance rigorously because it is easily long enough that I don’t think about it (the battery indicator will say 14 hours, but those are of course pretty flaky). I’m a vim/Firefox with ads blocked guy though so I guess I must not be making it work very hard.


>driver problem, touchpad randomly not working, wifi failing, all of which I've had with thinkpads.

Those are Linux driver issues, not HW issues.


Incompatibility is a two-sided issue; there are enough laptops out there that work perfectly fine in Linux. But a brand isn't a technical promise, they'll miss with some models and hit with others.

People always complain about Lenovo in these threads, I think because they are held up as the “good non-Apple laptop” brand for whatever reason. I suspect this reputation makes people assume they can just grab any random model and it will work perfectly. That’s just a roll of the dice, maybe weighted in favor of working, but still random.


>But a brand isn’t a technical promise, they’ll miss with some models and hit with others.

Sure, that's why there are now a dozen laptop brands that ship with Linux compatibility in mind.


> Those are Linux issues, not HW issues.

The touchpads on my wife's T470 and my T480 have very similar intermittent issues, despite her running Windows and me running Linux.


That only matters in a technical sense -- as a consumer, I want a flawless out-of-the-box experience. I don't care if X isn't working because of Y or Z; I only care that it isn't working.


They are very likely hardware issues that the Linux driver is simply not working around rather than being actually wrong.


And when I can’t join a call because the Wi-Fi has stopped working I’m sure everyone will appreciate that distinction


Did Lenovo sell you the laptop with Linux compatibility explicitly stated?


If you're using Linux and there are no good, reliable drivers for your machine, it might as well be a hardware issue.


As a user, I don't really care about the root cause.


If you're too lazy to make a Linux driver for your hardware, it's a hardware issue.


Maybe some HW companies don't have the budget to write drivers for the 3% userbase that is PC Linux users, especially since most commodity HW is in a constant race to the bottom in terms of pricing so profits are slim as it is.

So better check whether the HW you're buying is compatible with the SW you intend to use, especially now that there are almost a dozen laptop brands selling Linux-ready laptops. You don't buy an Xbox hoping it will run your Nintendo games collection and then blame Microsoft when you realize it doesn't work, do you?

And calling those driver devs "lazy" is a huge slap in the face, especially if you knew how overworked and underpaid people in that industry tend to be, as the profits are also very small. Not everyone is rolling in cash like Nvidia, AMD and Intel.

This sub can be quite pretentious at times.


> The ThinkPads are pretty good laptops, especially if you put Linux on them

I used ThinkPads for 7 years until getting new M2 Pro, mostly with Linux.

Touchpads on ThinkPads don't come even close to MacBooks'. You need an external mouse.

Also, the base screen quality on current MacBook Pros is beyond their top-quality products. Try to find a 1000-nit screen; not even gaming laptops have quality ones.

And battery life...

And performance...

When you have enough performance on your machine, the physical feel, the screen, and overall stability matter beyond everything else. ThinkPads have better keyboard durability, though.

I also used to have one of those OLED ThinkPads and that was the biggest mess I ever had. They even cancelled OLED screens on all products for 4 years after that. The screen just broke every month.


The nipple on the ThinkPad is unsurpassed. It's basically the reason I can't migrate away from them.


I have tried to use the nipple many times, but maybe I was unlucky with it. I was never able to get good enough drivers for it on Linux, and as a result it was never accurate, just very clunky instead.


This was on Linux. It takes a bit of getting used to, but I won't go back unless I'm absolutely forced to.


I agree that I liked it on Windows. But I was not able to get the same experience on Linux.


My brand new work thinkpad is total garbage compared to my two year old MacBook Pro. Only laptop where I need to use an external mouse.


Yes, and the fan blows directly onto your mouse hand, making mouse usage awkward as well.


Actually, this is my favourite feature of my work ThinkPad, especially during winter. Really. That's despite the platform's tons of disadvantages for the price.


Agreed. I got a gen 10 ThinkPad (with Linux). The power supply makes a crackling noise. After 6 months the fan started making grinding noises. The screen flickers when the CPU is loaded, and once a week the whole thing just locks up. Worst laptop I've owned.


Did work install any crapware on the laptop? Or is it bad even without?


The software is fine. The computer isn’t slow at all. It’s just not a well designed computer.


I've used Linux+Thinkpads for years.

W-series, P-series, top of the line

Switched to a Mac M1 this year, and... longer battery life, better performance, higher resolution, brighter screen. It's not even close.


I have an M1 Pro 14 and a work-issued P14s, which is awful. Creaky plastic, the worst trackpad I've ever used, a terrible display, spongy keys, and it runs unfathomably hot. All. The. Time. Every time I see someone recommending Lenovo, I cringe. It is night and day when compared to the MacBook Pro.


Weirdly, I have both of these as well and feel the total opposite. Give me the ThinkPad keyboard any day of the week. The Mac keyboard feels downright anemic.

I wish Apple would stop making the trackpad so damn big, though; the amount of palm activations I get on that thing drives me bonkers.


I'm intrigued by this. I'm typing this one-handed on the MacBook with my other hand resting on the trackpad without interference of any kind. The hinged trackpad on the P14s takes 5-10 minutes to be usable from a cold start (these happen often due to the combination of an anaemic battery and a power-hungry Intel processor); it's as though it needs to warm up. It is a pile of overpriced junk not worth the value of the parts it's made with. I've had other ThinkPads and generally disliked them - even the X series, but that boiled down to personal taste - not a fan of the aesthetic - and a crap trackpad. This P14s, though, is unmitigated shite.

My experience of the P14s says that either I have a dud (other colleagues complain vociferously about them, too), your MacBook is defective, or both. None of which are ideal!


I have zero love for the trackpad+keyboard.

But 80% of my usage is with an external keyboard and mouse.


I've found Thinkpads to be trash for the past few years. Could be bad batches, but I sent back my X390 three times for warranty repairs and its replacement T14gen3 once so far.


Tachiyomi makes Android significantly better for comics/manga than iOS.

It's one of the reasons I sold my iPad for a Galaxy Tab S7+. Every alternative I tried on iOS was just shit in comparison, Paperback especially.


> yet it seems like people are fighting battery life issues all the time

I miss the days when I charged my cell phone twice a week.

It didn't take 100-megapixel photos or display 4K cat GIFs, so I understand why.

But I miss that battery life nonetheless.


You can still buy basic phones that last all week, they're very cheap.

But nobody wants a phone, they want a tiny portable computer.


> But nobody wants a phone, they want a tiny portable computer.

Yes, including me. I want everything :/


A modern smartphone will last for literally weeks if you turn off all data and only use it for calls; the batteries are massive to support streaming video and playing games.


It's largely the background data-hungry apps and the obsessive screen time with foreground apps. As someone who almost uses their phone like a dumb phone, I charge my Pixel 5a about once every 7 days now, when it gets to around 20% remaining. It was more like 10+ days when I first got it, but that was also before I decided that avoiding the 0-20% range might be better for long-term battery health.

The most power-hungry apps I run are Slack, the built-in GMail and Messages clients, the Garmin app to periodically sync with my watch, and sometimes Firefox if I actually spend time browsing there instead of on my preferred laptop environment. The camera app also eats power when actually taking pictures or video, but I guess I rarely do.

When I first start using a phone, I go through a little effort to disable things I don't want via the system apps menu. For me, that includes the Google Assistant and their native launcher, because the last thing I actually want is to trigger search functions willy-nilly. If I want to search, I'll open Firefox and search...


I have found that my newest phone (Samsung S23) has significantly better battery life than my old Pixel 3a ever did. I really have to try hard to run the battery down and with my normal usage it easily lasts two days.


I only charge my iPhone 12 about every 2 days as well - at least with normal usage. But I can also easily run it down with an audiobook playing all afternoon over Bluetooth. No idea why that uses so much power.

When I’m at home I’m so close to charge points all day that I don’t worry about it. And when I’m travelling, I bring an external usb-c battery pack that can fully charge my phone about 6 more times or give my laptop another few hours of use.

Sure, more battery life would be strictly better. But I’m happy with the state of things right now.

Apparently the EU has legislated toolless user replaceable phone batteries by 2027. I'm curious whether Apple plays ball and, if so, what iPhones will look like in a few years.


> Apparently the EU has legislated toolless user replaceable phone batteries by 2027.

The legislation calls for phone owners to be able to remove batteries “with the use of commercially available tools and without requiring the use of specialised tools, unless they are provided free of charge, or proprietary tools, thermal energy, or solvents to disassemble it.”

Essentially, no glueing the screen to the battery, which is sensible. I take umbrage with the "without requiring the use of specialised tools, unless they are provided free of charge". Manufacturers should make available specialised tools commercially and be allowed to request deposits for specialised tools by individuals looking to perform one-off repairs to ensure return. Otherwise, it's reasonable. I doubt we'll see a significant change in the form factor.


I charge my Pixel 5 pretty infrequently, couldn't give a number though. I leave it in battery saver mode automatically when it's below 70%, and often I will just plug it in to a quick charging cable (the one from my steam deck actually) for 20 minutes to "top up" (may not fully charge but who cares). I only notice my battery life about once every other week. Other than frequent photography, I guess I don't do very much intensive processing on my phone though.


I don't get these battery issues people have. I use a 3 year old Huawei P30 Pro, and I still get 2 days of battery life when charged to full. Apparently I average around 4 hours of screen time per day.


I recently bought a Sony Xperia 10 IV and the battery easily lasts 4 days. Performance is great too, at least for my usage (I don't play games on the phone though).


> I've tried finding alternatives on my last upgrade cycle and it's like having to live with endless amounts of compromises just to get away from Apple.

Everything is a compromise.

A while ago, I wanted a new notebook:

I looked very hard at a 16" M1 Pro with 64 GB of RAM, at approximately USD 4000, tiniest storage possible. I really really wanted to run this with Asahi Linux.

I purchased a Dell Inspiron 7610 with a 16" display (and known touchpad design defects), 3K resolution, sort of lightweight, a Tiger Lake 11800H CPU, Intel + Nvidia hybrid graphics, now running 64 GB of DDR4 RAM, a 2 TB very fast PCIe 4 SSD, and a 1 TB PCIe 3 SSD.

Professionally I run this with Windows 11 Professional + VMware Workstation -> Fedora 38 because of an Azure VPN; in my private life this is plain Fedora 38 dual-booted.

Why? USD 1700. Less than half the cost.

Native podman / docker. CUDA. The (still) dominant architecture (x86). I am typing on Proper Keyboards anyway.


>I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time.

My Pixel has good battery life. I know it's anecdotal, but it's been fine.

I've always been an Android user on smartphones. I was on Samsung in the past, but now I have a Pixel. I like the Pixel and vanilla Android experience overall. Android allows some freedom to customize; I like the call screening and the wide choice of apps. Pixels are also starting to be supported for longer; maybe not as long as iPhones, but it's always been an Android issue and they're finally starting to address it. Stability is there IMO. I have all the apps I need to do what I do every day, but obviously everyone is different.

I own a variety of different hardware, including Apple products, but I'm not 100% reliant on their ecosystem, so I'm keeping the Pixel for now.

>Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

Agreed


My strategy is to go all-in on Apple as my daily drivers, and then to tinker with Linux in my spare time. For example, my main phone is an iPhone, but I recently bought a used OnePlus 6 with the intention of installing NixOS on it. My laptop is an M1 MacBook Air, but I code in Linux VMs, and have a Linux NAS at home.

As many of my services as possible are self-hosted, and as many of my apps as possible are FOSS, but I access them with Apple hardware. And as my M-series Macs age, they'll become Asahi machines.

This is the best compromise I've found. The truth is, Apple stuff still does Just Work, and having an actual Unix base is nice too. Open standards and protocols can get you a long way away from the walled garden aspect of Apple to the point where really they're just nice computers that don't really infringe on my life in any way.


On the app development side of mobile, I find Android much messier to develop for. It has its perks, like the JetBrains IDE and the ability to use bleeding-edge libraries (Jetpack), but those don't counteract the downsides: needing a laundry list of third-party libraries to do practically anything, there being no well-supported vendor-preferred "happy path" for various things, Java ecosystem baggage, fighting Proguard, etc.

And that doesn't even get into the "fun" of there being differences between the versions of Android shipped by different vendors significant enough that maker- and model-specific bugs and behavior inconsistencies are a concern, which is only a thing because of manufacturer insistence on deep customization (compare to Windows, where if it runs fine on your PC, it probably does for 99%+ of other PCs too).
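On the "fighting Proguard" point, the usual failure mode is R8/ProGuard stripping or renaming classes that are only reached via reflection, so release builds crash where debug builds didn't. The typical fix is keep rules along these lines (class names are illustrative, not from any real project):

```
# proguard-rules.pro -- hypothetical keep rules for reflection-heavy code
-keep class com.example.api.model.** { *; }   # JSON (de)serialization targets
-keepattributes Signature, *Annotation*       # keep generics and annotations
```

The annoying part is that you only discover which rules you need by watching the release build break, one ClassNotFoundException at a time.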


Apple is an ecosystem much more than Windows is. Devices work together and enhance each other. So jumping to Linux isn't easy: everything has to change.

It's more like emigration or a divorce: it will hurt, and it's a big change. For most, it's not worth the bother.


I've found that ChromeOS + Pixel 6 is fine. I have a Fitbit Sense 2 for a watch (I hate the idea of my watch telling me I have a meeting coming up lol).

ChromeOS is great. The host OS "just works" and I have a VM/container env to do development in. It feels secure, operationally straightforward, and low maintenance.

I don't really care a ton about my phone. It works, battery is fine. It makes phone calls, plays music. idk. I care way more about my laptop.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

And the trackpad! I always evangelize the Macbook trackpad because that shit is bananas, as it were.


I used to think that, but my latest Asus laptop has a trackpad just as good as the macbook (which I use professionally). Actually, I like PC trackpads more, just because they tend to be reasonably sized rather than the monster trackpads on most macbooks (although, fortunately, they seemed to not be quite as big in the M1/M2 iterations!).


I'm sorry, but I just don't buy that it's as good as the MacBook. Can you detail what model Asus laptop you're using? I'm legitimately curious what the hardware is.

The only trackpads I've found that feel close to Apple's on a hardware level are the ones produced by Sensel (https://sensel.com), and those aren't in every laptop. Then you have the driver situation on top of it, and Apple's tight vertical integration just always seems to give it the edge - I literally never experience invalid taps or movement, etc.


https://rog.asus.com/us/laptops/rog-strix/2021-rog-strix-sca...

Not only does it have a great trackpad (with the ability to become a numberpad), it also has an optomechanical keyboard with a "blue switch"-like feel. Plus a 3080 in it for games. It also has great cooling, so the fan only comes on if you're hitting the 3080 hard.

It is certainly the most fun laptop I've owned (it has LED backlights for the keyboard, too)! Granted, it runs Windows, but Win11 is pretty nice, honestly. No idea how amenable it is to putting Linux on.


Sometimes I think the biggest secret in this industry is that 2 people can outproduce even the largest companies. The only thing keeping everyone's job secure is that nobody can accurately identify which 2 people.


Isn't that often a case of "What one programmer can do in one month, two programmers can do in two months."?

Companies with all their bureaucracy are often slow, but that bureaucracy also isn't entirely superfluous, and frankly I don't know if any of this was on any given company's list of things to do. Not sure anyone beat anyone to the punch here.


Yes, it often is such a case. Team size should depend on what needs to be implemented.

For "boring" and well-trodden stuff that people can do on autopilot, sure, that scales.

For projects with no obvious path, an experienced and lean team is necessary.

I have wasted too much time on superficial CRs and on hand-holding people who didn't provide value.


Yeah, that's definitely not true. You'll know when you interview them and they have extensive contributions to open source, an active GitHub or GitLab account, can show you things they're proud of having made in their spare time, can talk at length about technologies and implementation details none of your other candidates can, can show you their participation in mailing lists, etc.

There are a billion obvious signals compared to the people who clock in at 8 AM, sign out at 5 PM, and argue with your teammates about problems that entire swaths of engineers consider rudimentary: understanding how to write patches in readable ways, not screwing up git logs, not submitting PRs 10k SLOC long with no explanation as to how the automated output was created for replication and verification.

There are obvious clowns in the industry, and people who really love doing this stuff, and you can figure out who they are in a 5 minute conversation.

It really insults people when you tell them this, because yes, there are absolutely people that work harder and longer than you because they want to and enjoy it--they don't just go home and watch Netflix.


This is, in my opinion, very wrong; you will end up just selecting for a subpopulation which might or might not have the qualities you are looking for.

You seem to be confusing volume of work and passion for skill. Some of the best engineers I know are very diligent about work-life balance (I guess you call that clocking in/signing out) and do not have a publicly visible GitHub, because in their spare time they are either with their families/communities or enjoying other non-tech hobbies.

A good programmer (from an employer's perspective) is first and foremost a professional; passion and loving the work are good but can only get you so far. I know a lot of passionate programmers who basically suck.

> There are obvious clowns in the industry

Agree, and the worst seem to be the ones who think they have a magical/predictive ability to discern talent based on their limited life experience.


People who do more will, by definition, have more experience than those who do less. The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

It's a delusional concept. If you want to be good at anything in life, you will end up spending more time than other cohorts in any given discipline. But for some reason in tech, people like to believe that's not true because "work life balance."

Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

"Very diligent" people doing less than "very diligent" people doing more will generally always have less experience and skill.

There's nothing meaningful to argue here. Some people delude themselves into thinking it's sacrifice and that some people have to give up "work life balance":

The reality is that there are populations of people across all sorts of discipline where giving up more time doing X, Y or Z isn't sacrifice, it's because people genuinely enjoy doing more than others, spending time becoming better than others, and producing more:

And yet, that scares people, and people like to deny that it's true instead of acknowledging their own mediocrity.


I think it's more that there are multiple spectrums here and a lot more than two buckets to put people in.

Sure there are very passionate people who enjoy programming so much that they wish to do so far outside the standard workday.

Some of those people are also very highly skilled, experienced and organized.

There are also people who work a standard day, push Jira tickets around, and try to blend into the organization and hope nobody really questions how much they personally get done.

There are also plenty who have good work/life balance and are also skilled, experienced, good problem solvers, good communicators, and very valuable to have on your team. They might have some measurable productivity loss compared to your ideal, but probably not by 2x or 10x.

I have also run into several of the ultra-focused passionate folks who will stay up all night hacking at a problem to make it work, who produce prolific line-counts of code, and fall very much into your camp of 10k line indecipherable PRs that are definitely not going to be maintainable long term in a team or organization.

You can try to correlate some of these factors together, but it's perhaps not as simple as your original comment presented.


> People who do more will, by definition, have more experience than those who do less.

You're assuming skill rises meaningfully with just volume of experience. Who's a more skilled driver? The plumber that drives his van around to jobs all day and does about 1K miles/month or the guy that mostly rides his bike except for weekends when he's taking a defensive drivers course or going to a track day?

I'd bet $$$ that the plumber spends a lot of time checked out / in zombie mode on the freeway between jobs and the motor-sports enthusiast is hyper diligent when driving.

> Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

Perhaps, but this is - again - because you're missing the point; meaningful advancement in skill comes from experience gained while attempting something an individual is new to/uncertain/uncomfortable with and not the same thing that the individual has done a thousand times before.

> You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

Yes. A simple counterexample: not all musicians that practice 18 hours a day become successful. There's a lot of work in being the best, absolutely. But some people have some fantastic genetics/general-upbringing/predisposition to leverage. Same thing with sports. There are comedians that you've never heard of who spend more time writing jokes than world-famous comedians do.

> Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

Absolutely, but the people that do $thing all the time _and get better at it_ are the people that are constantly looking to $difficulty++ on $thing. I love reading and I'm always getting better at it because I don't stick with the same language/length/difficulty level all the time :).


Video worth a thousand words: https://www.youtube.com/watch?v=nMEzr5uvrNM


>> People who do more will, by definition, have more experience than those who do less. The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Nope. Some people spend a lot of time doing because they need more time to get it right. Hopefully they do improve over time.


You continue to equate volume and time with quality of work. But still...

> People who do more will, by definition, have more experience than those who do less.

Disagree. The quality and intensity of work are as important as its length. 100 hours of focused work beat 1,000 hours of distracted, unfocused work.

It also depends on the type of work: I would argue that someone who does 50 hours of Haskell and 50 hours of C++ has more experience than someone who does 150 hours of just C++.

It also depends on the type of experience. Sure, maybe (a strong maybe) your lone 1000-GitHub-star developer can produce more code, but being a professional programmer is more than just producing code. Communication, planning, and generally not being a douche are also important. A parent who works 8-to-5 and spends the rest of the time managing a family could score higher on those metrics.

> The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Same remark here: it depends on the focus and intensity of the work. It's well documented that after 40 hours a week, the quality of work and focus tend to degrade.

But even more important: you are assuming that the time the "8-5 people" spend not working on computer-related things somehow doesn't count as experience and cannot synergistically enhance one's professional work.

At the base level, you have foundational health-related things like a good diet and proper rest, which take time. But there are also well-known effects like ideas popping in when one gets some distance from a given task.

More importantly, cultivating other interests stretches one's mind and has some interesting effects.

> Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

Strawman, not really gonna touch this.

> You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

This is still a strawman. But at least it's more interesting.

To answer the question: yes... it's called talent, training quality, and genetic predisposition.

If long hours of work were the only thing required, the profession of coach wouldn't exist. In sport, most of the top players are very motivated people with ungodly work ethic and drive... They still invest in personal coaching because just "doing" is not enough; doing the right thing, the right way, is also very important.

> It's a delusional concept. If you want to be good at anything in life, you will end up spending more time than other cohorts in any given discipline. But for some reason in tech, people like to believe that's not true because "work life balance."

> Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

I see this as faulty logic at multiple levels. Even if I grant you that hard work, passion, etc. are strongly correlated with excellence, it doesn't follow that they are a good selection criterion when looking for excellence.

A good analogy is height and basketball skill. It's pretty clear that being tall helps; some might even say it's required. But within the NBA (or any other organization of professional basketball players), nobody is drafting people based solely on height. One might even say that the relationship between height and skill in the NBA is fuzzy at best.


Supposedly there is good data to suggest that people who work 10% longer make 40% more money. Now, that's not necessarily causal or anything, but it doesn't have to be. From a hiring perspective, it just has to be true. Unless you suspect that a candidate has scammed his previous employers, it is rational to prefer candidates who, based on their hours worked, are more likely to be effective at making money. Making money is generally the business of business.

"The U.S. Bureau of Labor Statistics reports that the average person working 45 hours per week earns 44% more pay—that is, 44% more pay for 13% more work"


Yeah, that describes me! I’m still insulted by this, because I have worked with plenty of people who basically do not exist on the internet but are far smarter than I will ever be. You’re wrong, plain and simple.


And a lot of times companies won't hire these passionate people.


They don't even know who they are because B-players only hire other B-players.


They know who these A players are but are afraid to hire them.


A rational economic system would eliminate the distinction between education and production so that people are constantly learning/improving while occasionally contributing to massive breakthroughs. We systemically under-develop every single human being on a global basis right now.


Everyone deserves an experience like that, at the very least – everyone deserves to be in a position where they have the opportunity to really enquire, or engineer, or know what it's like to solve new problems. It really is a good purpose that education could meet.


I'm frustrated that "good management" involves breaking things down so small that there is no room for enquiry or engineering. Jira ticket #5229 "Port get_order_* API" doesn't leave much room for problem solving.


It’s almost like business needs come first


Is there a world where we put people's needs first?


Really? There are tons of people who learn while working, which is what I assume you are advocating for rather than literally "eliminat[ing] the distinction between education and production," which would include school children working. This is especially true at the top end of fields like medicine. Surgeons create and learn new surgeries, for example. And I don't think Sergey Brin took a course on how to create Google in university, and the Google engineers did not learn how to scale it there either. See also the massive research arms of Google et al. So "every single human being" is trivially false.

But even in the general case, college does not directly prepare you for your profession (nor is it meant to) and you are expected to learn on the job. And if what you say is true that you are not learning while working there would be no reason for companies to seek employees with experience.


> ... 2 people can outproduce even the largest companies.

Only if they're not including documentation. Or maybe only very, very, very minimal documentation.

If you want full documentation, translated to a bunch of languages, then (for now at least) you'll need more than those 2 people.


Fortunately, in the case of a GPU driver, you don't need a lot of documentation besides the standard it implements...


> This puts Apple to shame, plain and simple. They obviously don't care about standards, or compliance, because they like people to be walled in their own little private garden

Is everyone missing the part where Apple left the door open for other operating systems and development thereof, when it would have been relatively trivial for them to lock the laptops down?

They literally made their own silicon and built an entire platform. Do you think it's a mere mistake that they left it open to running other operating systems?

It's really disappointing to see everyone bashing Apple when it's clear to anyone paying attention that they made a conscious decision to leave the door open for 3rd party development.


> It's really disappointing to see everyone bashing Apple when it's clear to anyone paying attention that they made a conscious decision to leave the door open for 3rd party development.

Maybe I'm just a jaded grey-beard but I suspect that this is more of a "placate the anti-trust regulators" play and not a genuine olive branch offering.

Apple gets to say "see, look! Not only are we not locking people out - there's a whole micro-niche community that's taken root. If that isn't proof we're not abusing our position, I don't know what is..."

They left the door open, but just barely. The reverse engineering efforts will always be a step behind making sure that there's always going to be a "non-apple" experience that will be objectively inferior in one way or another.


I can’t imagine any reasonable argument that would make Apple be a target for an anti-trust action for _Macs_.

I understand skepticism and not always giving corporations the benefit of the doubt, but they _clearly_ spent a lot of time and resources to make third-party OSes viable on Apple Silicon Macs.


They did. Which is why it’s so baffling that they didn’t document any of this stuff. 5 minutes of documentation by apple engineers on the boot process or GPU would have saved 5 hours of reverse engineering work by the Asahi Linux team.

Seems to me like they can’t decide whether they want Linux on their hardware or not. I bet different people in the org are pulling in different directions.


It’s not baffling at all. Opening the boot chain is work, but making presentable documentation is a lot more. It’s not 5 minutes of work: it’s years of checking the licensing on everything, designing stable APIs that are fit to publish, supporting them, having engineers working on this. You can’t just throw your internal “G13G scheduling pipeline” docs over the wall.


> _clearly_ spent a lot of time and resources to make third-party OSes viable on Apple Silicon Macs.

This actually isn't clear to me -- can you explain? Besides keeping an open bootloader [0], I'm not aware of any affirmative actions Apple has taken.

[0]: https://github.com/AsahiLinux/docs/wiki/Open-OS-Ecosystem-on...


The open bootloader didn't magically appear one night in Apple's git repository.

It boots in a notably different way than iOS machines do, and has some (AFAICT) pretty unique capabilities, including a fully-verified signed-boot of macOS partitions, while allowing third-party kernels at the same time.

Asahi's "Introduction to Apple Silicon" [0], and specifically "Security modes, Boot Policies, and machine ownership" paragraph outlines some of that, Apple's "Platform Security" [1] whitepaper does too.

Asahi's docs also explicitly state the same thing [2].

If you still don't think that shows a significant amount of work and care put into deliberately allowing third-party OSes to work on those machines, I don't think I can convince you otherwise.

[0]: https://github.com/AsahiLinux/docs/wiki/Introduction-to-Appl...

[1]: https://support.apple.com/guide/security/welcome/web

[2]: https://github.com/AsahiLinux/docs/wiki/Apple-Platform-Secur...


There is also no precedent for Apple making any kind of proactive design choices around future regulation. They are clearly the kind of company that does what's best for them, nudges in the requested direction when asked to change, and then moves on. This is in the DNA from the top down. It would certainly be weird for the decision about third-party OSes to be about that.


> I can’t imagine any reasonable argument that would make Apple be a target for an anti-trust action for _Macs_.

Why can't the same "no OS except iOS is allowed on iPhones" argument be applied here? If the only OS that boots on a MacBook is macOS, that starts to smell like anti-competitive behavior in the same way that allowing only App Store-approved apps to run on iOS is anti-competitive.


Because the market share is orders of magnitude smaller.


This is one moment where I really hate what happened to Twitter, since I feel like I recall a tweet from Marcan ages ago pointing out that Apple has fixed some things regarding 3rd party OS support.

That is to say, unless I am truly off my rocker and remembering a fever dream: it's not just a "placate the anti-trust regulators" play.

I'm pretty sure I've also seen it mentioned on HN itself that Linux is still used within Apple for certain aspects of hardware development, so Apple themselves need it to work to a certain degree.


I remember reading that too, but I think it was on mastodon — I don't feel like tracking down that particular thread now, but maybe that helps you on your search :)


> I remember reading that too, but I think it was on mastodon — I don't feel like tracking down that particular thread now, but maybe that helps you on your search :)

That would be strong evidence that there's at least _some_ support for this internally, but it doesn't explain why they bothered at all.

The lack of explicit endorsements and documentation certainly has me thinking that at least _some_ of Apple doesn't want this happening at all, so they're at least going to make it hard. So it may not be a "what's the bare minimum support we have to do to avoid being the poster child for anti-competitive behavior" calculus that's completely driving it after all.


> but doesn't explain why they bothered at all

I mean, you're kind of glossing over my second point from my comment:

> I'm pretty sure I've also seen it mentioned on HN itself that Linux is still used within Apple for certain aspects of hardware development, so Apple themselves need it to work to a certain degree.

Anyway, I went and dug around and found the HN discussion of that Marcan tweet that's been deleted - you can browse it below if you missed it or are curious:

https://news.ycombinator.com/item?id=29591578

Notably, this comment from Saagarjha who I trust on Apple-related matters is what I was referring to regarding Apple using Linux internally for some of their work:

https://news.ycombinator.com/item?id=29599889

All this to say, if the people who have some level of vested expert knowledge in this domain - like Marcan or Saagarjha - don't buy the conspiracy theory angle, then I'm inclined to side with them.


Here’s a Wayback Machine for that tweet: https://web.archive.org/web/20220102153759/https://twitter.c...

But I have to say I don’t understand how what Saagar is saying supports your (our?) point. Apple has the ability to do a whole lot of things that will never make it to end-users — just because some flavor of Linux is being used in the CPU bringup process doesn’t mean anything for the final products.

As evidenced by the fact that M1 is far from the first chip they brought up in-house - and even then only on Macs, not on iPads which use the same chip.

I hear they even have some non-Apple hardware running macOS in data centers, the absolute horror ;P


If you really care about OpenGL drivers, it sounds like you want to be on Asahi, not macOS. Doesn’t seem like they’re always one step behind?


I just don't see how it's incumbent on Apple to implement a standard only because it exists.

It takes significant resources and impacts release schedules to, say, support an ongoing Vulkan implementation... I think there would need to be a business argument for it. Avoiding "shame" probably doesn't cut it.

It just strikes me as strange when people expect Apple to spend money and focus on their own personal priorities... especially for things that are inherently community-driven.

Perhaps there's a "community goodwill" argument to make, though I doubt they'll try to chase the goodwill of people who complain that Apple "has become is just a corrupted mess of greed behind a curtain of politically correct marketing videos."


>> I just don't see how it's incumbent on Apple to implement a standard only because it exists.

Well Apple does implement the standard. They DO have OpenGL, but their driver is not fully compliant apparently.

>> It takes significant resources and impacts release schedules

Well, it seems two people working for a couple of years managed to do it without much in the way of the documentation that Apple clearly has (they built the F-ing thing).

Apple has considered OpenGL deprecated for a long time now, but they do support it (they brought it to M1 and M2), and it makes sense to keep doing so. They really should be more conformant. If a standard is worth supporting, it's worth supporting well - and Apple has unlimited funds compared to these two people.


They support it because they built some stuff on it at some point. They just want those things to continue working. They have no further need to support it.


> It takes significant resources and impacts release schedules to, say, support an on-going vulkan implementation...

Let's see... on one hand we have two devs who made it happen (for OpenGL; with Vulkan coming in the future), with essentially no funding, purely in their spare time and without any documentation, just by reverse engineering the hardware.

On the other hand we have a trillion dollar corporation, with 100 billion dollars in profit last year, with over 150k employees, with full documentation for the hardware.

Yep, it checks out. Apple's definitely resource-strapped and can't afford to do it. No way they can compete with two spare-time hobbyists. Would be too expensive.

I'm sometimes really amazed how Apple-biased a big portion of HN is. People here can always find some sort of excuse to justify whatever shitty new thing Apple is doing (or why it isn't doing something), no matter how much mental gymnastics it requires.


No one said they don’t have the resources. They just asked why it’s incumbent on Apple to spend them, which your response doesn’t address at all. If we’re going to post with that much sarcasm, let’s try to have an equal amount of substance to go with it.


Developers don't want to create separate Apple-only backends for everything. Users want to use software; they don't care about what backend is used. When Apple has to personally implement a Metal backend for Blender, it's obvious that there is a problem - and this is despite Blender being one of the best-funded open source projects. Having to support multiple backends also increases development time and future tech-debt surface area. It's mostly not a moral argument, as you seem to imply. This is just a feature that developers and users want. Apple is free to not implement it. It just means that many apps will not support Apple, or they will support it at the cost of other features, or they will use MoltenVK.


What? I'd argue the graphics APIs are among the most important interfaces right now. It's not that we'd all feel good if Apple adhered to standards for once; it's about supporting a legacy of applications that have been built over the last 20 years or so - allowing a platform to be used the way the rest of the industry has been evolving, not cheap vendor lock-in.


I personally do not believe in those 'the market will fix it' arguments. There is not only community goodwill but also some corporate responsibility to support open standards. Since Apple dominates a large part of the market and strongly profits from lock-in effects, open standards are among the few things that make competition over the best OS (aka a market) work. I understand, however, that it is mostly commercial or regulatory incentives that make current big tech move. A bit of shame is still appropriate IMHO: they could easily support an open source driver without releasing anything official.


> I just don't see how it's incumbent on Apple to implement a standard only because it exists.

I mean it isn't. But it would be nice if they supported at least any graphics standard.


It's because OpenGL is a dying specification based on an outdated, horrible programming model, and Khronos's conformance tests are weird and ridiculous. The conformance tests are not open-source (there's a separate "conformance suite" available on GitHub which is based on Google's dEQP, not the Khronos internal test suite), and there can be some pretty major bugs and gaps in your implementation while still getting it "certified standards-compatible".

Apple has committed to support for OpenGL 3.1, and that's it. They even rewrote their OpenGL driver for Apple M1 to be an emulation layer on top of Metal, so that existing applications keep working, but they're not going to implement any newer versions of OpenGL. Nor should they.

I have a lot of criticisms of Apple, and I think they could be doing a lot to make Metal a better API with better tooling, but not caring about OpenGL is a perfectly sane decision here.


For GLES all the relevant tests are open-source, under KhronosGroup/VK-GL-CTS on GitHub.

There's a small set of legacy "confidential" tests that you have to pass for GL conformance. They can't be open-sourced for legal reasons. The current CTS working group would like to get rid of them, but it's hard to justify spending time on GL these days...

You can definitely pass conformance with a driver that's horribly broken in practice. GL/ES/GLSL are huge and there are holes found in these specs all the time. And it's not like game developers read them anyway; whatever works on their test devices gets shipped.


Apple themselves tell you to use Metal instead of OpenGL on Apple platforms. It’s not beating the big corporation if the big corporation isn’t in the competition.


Agreed. It's a misrepresentation. Apple doesn't do it because it's too hard, they don't do it because they don't see value in doing it.


They probably see alternatives as providing negative value vs. total clarity from the platform owner re: how to use the GPU.


> This puts Apple to shame, plain and simple.

Not really? Like, we can look at this and point fingers, but Apple will not be ashamed, because they do not care. They deliberately don't bother with standards conformance unless doing so is a part of their strategy. Standards conformance for OpenGL or Vulkan doesn't make sense when they want everyone to use Metal.

It's dumb, and I marvel at how childish Apple always behaves with things like this, but that's just who they are.


> They obviously don't care about standards…

Apple has invented, contributed to, and adopted a long list of standards. You can easily Google or ChatGPT the list if you don't know your tech history.

Apple deprecated OpenGL and OpenCL support in 2018 for very reasonable technological and strategic reasons that I understand you disagree with. But that doesn't change the fact that OpenGL is a terrible fit for modern computing/GPU architectures.


Apple also invented and contributed OpenCL. Then later Khronos invented OpenGL compute shaders, which is completely different.


How did they beat the big corp, which probably didn’t even have this on the agenda? Or any other big corp that wasn’t planning on doing this? I’m all for it, but they clearly did something no one else was willing, or even planning, to do.


I've been watching the Asahi Linux project from the beginning from the sidelines. I find it both exciting and fascinating for two reasons:

1) It's a project that a lot of people want to see happen, and

2) It's a stellar example of a well-executed open source project at all levels.

Moreover, Asahi has been my daily driver since the alpha was released back in March 2022. I migrated to Fedora Asahi Remix (their new flagship distro) earlier this month, which is excellent (https://jasoneckert.github.io/myblog/fedora-asahi-remix/).


Yep. Just switched to PC for DaVinci Resolve renders/exports and built a 64-core Threadripper with 128GB of RAM, 16TB of SSD, dual RTX 4090 GPUs, and dual Thunderbolt 4 ports with awesome cooling for $15K. Something worse from Apple would cost you three times that and be less serviceable.


I’m curious how that is possible, considering that a fully maxed-out configuration of a Mac Pro (including pre-installed FCP and Logic) costs less than $13k.


Genuine question: why would Apple make a driver for something they don't support? And if they wouldn't, is there really any shame?


well, the article mentions webgl


These two are tremendously cool.

The story seems really complicated from a technical point of view.

Whom is OpenGL ES 3.1 compliance for?

Apple is shipping DirectX compatibility in Game Porting Toolkit.

I understand you are making a stylized comment. My stylized comment is: there's a lot of stuff going on everywhere, all the time, with all sorts of technologies. You're not illuminating for me, compared to all the other people toiling in obscurity, what about this has pressed so many buttons for the Hacker News audience. Because it's not OpenGL ES 3.1 compliance.


Apple ended up in a battle with a patent troll over FaceTime, which could be part of why it hasn’t been opened up.


Lol. It's intentionally closed, so network effects can drive usage. Common Apple strategy, straight from Jobs.


Jobs is the guy who claimed they were going to make it an open standard originally.


Jobs also repeatedly lied to and backstabbed his business partners when he thought it gained him an advantage. As we see here.


Jobs just added that to the keynote presentation after running it past nobody.


He was the CEO, if he wanted it done it would have happened.


Yes, Jobs, the great satan of tech. He died so that Tim Apple could sin with impunity. No FaceTime for Windows users — it will bring the rebels to their knees.



I can't believe it. I can't believe it. Wow. Alright, now we're getting somewhere.


I'm just glad we now have much better alternatives to FaceTime that Apple will soon have no choice but to open up.


> If i weren't an iOS dev, i would have ran away

I used to wonder why anybody who is not an iOS developer would buy a Mac. Now I just accept that some people make different choices than I would - just as with most everything in life.


If somebody makes a laptop as energy-efficient as an Apple Silicon MacBook, while being more serviceable, running Linux, etc., I will gladly buy it. I am definitely not staying with Apple for macOS.


Go into any Bay Area coffee shop and you’ll see it’s filled to the brim with zoomer developers building their apps and projects on a Mac. It doesn’t matter at this point why or how, but it’s more or less become the standard development platform for a lot of stuff, or at the very least the frontend for it.


> This puts Apple to shame, plain and simple

But Apple clearly has zero interest in OpenGL. That’s been obvious since before the M-series chips.

What they’ve done so far and continue to do is incredible. I love following what the Asahi team has done.

But anyone can win a “race” against an opponent who refuses to play.

Shame Apple for not playing the game if you want, but they could have easily done this if they wanted to.


You say that, but the financial barrier to entry for using FaceTime and iMessage means I have not once received spam. That's a real feature in the modern day.

Apple are by no means perfect, but on pretty much every other service I have used (shout-out to Mastodon, too small to bother with, I presume), I have received spam.

Advocates of truly neutral carriers, open code, et cetera, et cetera, so on and so forth don't really have an answer to spam in my experience. I would say that almost no one has an answer to spam except seemingly Apple. And while their answer sucks it is an answer.

In the modern world, telecommunications are a requirement to participate in society. I don't think it's unreasonable to have spam be your number one concern. It's a waste of my existence even though I ignore all of it and take appropriate pro-social action by using whatever kind of report mechanism is available to me. There are people who fall for the scams (I have met and even know a few), and they have lost heaps of money.

Apple would be destroying a huge amount of value for their users were they to open up their service more easily to spammers. It's not our job to give up our lives so someone's theoretical version of a better society can exist.

I say this with every bit of code I have ever published being GPL3 and having advocated it in every single project I have ever worked on. But I have never built a telecommunications platform.


> Advocates of truly neutral carriers, open code, et cetera, et cetera, so on and so forth don't really have an answer to spam in my experience.

Seems trivial: just don't allow anyone to message anyone else unless they've exchanged contacts physically (by showing each other's QR codes or something).
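A toy sketch of that scheme (all names and structure invented for illustration; a real system would need cryptographic identity, not a shared token): messaging only succeeds after an out-of-band token exchange, so unknown senders are refused by default.

```python
import secrets

class Messenger:
    """Toy allow-list messenger: a user only accepts messages from
    senders who were paired via a token exchanged out of band
    (e.g. by scanning each other's QR codes in person)."""

    def __init__(self, name):
        self.name = name
        self.pairing_token = secrets.token_hex(16)  # would be shown as a QR code
        self.contacts = set()
        self.inbox = []

    def pair(self, other, token):
        # `token` must have been read from `other`'s QR code face to face
        if token == other.pairing_token:
            self.contacts.add(other.name)
            other.contacts.add(self.name)
            return True
        return False

    def send(self, recipient, text):
        # Unknown senders are simply dropped - this is the anti-spam property
        if self.name not in recipient.contacts:
            return False
        recipient.inbox.append((self.name, text))
        return True

alice, bob, spammer = Messenger("alice"), Messenger("bob"), Messenger("spam")
alice.pair(bob, bob.pairing_token)        # in-person QR scan
print(alice.send(bob, "hi"))              # True  - paired, accepted
print(spammer.send(bob, "FREE $$$"))      # False - never paired, refused
```

The friction is real, though: you can't receive a legitimate first contact from anyone you haven't met, which is exactly the trade-off the reply below points at.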


Right, aside from friction for users, spammers will attempt to find ways around any system you implement. They are malicious actors. I don't think protecting a system against malicious actors is trivial.


If Apple didn't care, they wouldn't release anything at all.

But they did. It means somebody did care.

But they failed - beaten by an almost one-person-army "team" that started the race miles behind, from another country behind a chain of mountains. It's unreal.

It's unreal because Apple has everything - talent, hardware and software and got beaten by reverse engineering the whole thing.

Such a slap in the face.


We're kidding ourselves if we think Apple was putting an earnest effort into this.


They don’t remotely care about supporting a standards-compliant graphics API. It’s an anti-feature, the opposite of what they want (I worked there for a bit on GPUs). If I had to guess, the two things they care about would be 1) their own ecosystem (i.e. Metal) and 2) what’s popular for game devs (i.e. HLSL, DX, and Windows). So far that seems to be what they’ve been working towards shipping.


> If i weren't an iOS dev, i would have ran away from the apple ecosystem a long time ago.

Yeah, instead of drivers for M1, I'd be more impressed+happy if someone implemented the iOS APIs running on Linux, like we have Wine for Windows on Linux.



Did anyone use this to write iOS apps under Linux?


OpenGL (and GLES) are basically old legacy APIs at this point; it’s not too surprising that Apple won’t invest in them beyond existing app compatibility, right?


What does Apple get from not making their laptops the most performant portable gaming machines out there?


They lost the gaming market a while ago so they don't care anymore. The gaming situation on a Mac is worse now than 10 years ago.


Putting Apple to shame implies that they have some form of honour, which is clearly not the case; they don't care.


They learned from Microsoft in the 90’s.



