Hacker News | mbesto's comments

It seems the throughput has evolved with each spec, but the reliability and range haven't, unless I'm mistaken? This is a big problem in places where homes are built from concrete (e.g. the tropics), because the improvements to WiFi are basically irrelevant there.

And 802.11ah, in the 900 MHz band, which has some hope of penetrating such walls, is still very scarce, and the hardware is fairly expensive.
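A back-of-envelope link budget shows why the band matters so much. The sketch below uses the standard free-space path loss formula; the per-wall concrete attenuation figures are rough illustrative assumptions (real values vary wildly with thickness, rebar, and moisture), not measurements:

```python
import math

def fspl_db(freq_mhz: float, dist_m: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(dist_m) + 20 * math.log10(freq_mhz) - 27.55

# ASSUMED per-wall attenuation through concrete, in dB -- ballpark
# illustrative values only; thick reinforced concrete can be far worse.
WALL_LOSS_DB = {900: 6.0, 2400: 12.0, 5800: 20.0}

def rx_dbm(tx_dbm: float, freq_mhz: int, dist_m: float, walls: int) -> float:
    """Very rough received power: TX power minus path loss and wall losses."""
    return tx_dbm - fspl_db(freq_mhz, dist_m) - walls * WALL_LOSS_DB[freq_mhz]

if __name__ == "__main__":
    # 20 dBm transmitter, 10 m away, through two concrete walls.
    for f in (900, 2400, 5800):
        print(f"{f} MHz: {rx_dbm(20, f, 10, 2):.1f} dBm")
```

Under these assumptions the 5.8 GHz signal arrives tens of dB weaker than the 900 MHz one: both the path loss and the wall loss grow with frequency, which is exactly the physics working against the newer, faster bands.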

Otherwise you just have to run a wire through the wall and put an AP in the room. Your clients can still be wireless for the last few feet, which preserves the convenience of usage, just not the convenience of deployment.


This is basically my strategy.

At some point, you just can't beat physics. For best results you need multiple APs anyway.

> some of the best engineers I've ever worked with were at INST.

> You can see for yourself as it's AGPL and I assume you looked at the code

Can you really look at any codebase and tell that it was written by some of the best engineers? That's not a trivial thing to do.


Burning Man forces you to really think hard about social contracts.

For example - you won't get kicked out for leaving trash all over the ground, but you will absolutely be shunned and shamed by everyone around you for doing so. That notion simply doesn't scale to a place like the US, with 350M people of varying cultures, values, etc., because the social contracts are all over the place and inconsistent.


> These data centers seem to have wild amounts of money for investment, why not just mandate conservation requirements

This IS the complaint.


By all companies? I'd say less than 10% of all LOC today are generated by LLMs.

Really? In my bubble of internet news it seems the sheer number of companies that have formed and shipped LLM code to production has already surpassed existing companies. I've personally shipped dozens of (mediocre) human months or years worth of code to "production", almost certainly more than I've ever done for companies I've worked at (to be fair I've been a lot more on the SRE side for a few years now).

probabilistic != deterministic

Eh, it does and it doesn't. PE investors are actively asking why more of their portfolio companies aren't generating codebases with Claude Code. You are right that lawyers are asking about code generated by LLMs, but this is CYA out of ignorance more than anything else (btw - many purchase agreements have funny representations like "your code is free of bugs", which is downright hilarious).

So these two things are squarely at odds with each other... meaning, I don't know any PE acquirers who are actively terminating deals because the target acquisition's code was generated by an LLM, even if the lawyers try to get a rep about it into the purchase agreement.

For the record, I still have yet to have an M&A lawyer explain to me definitively that AI-generated code is an infringement... hence the question "who owns the code Claude Code writes" is still open.


The tension you are describing is real and the piece does not capture it well enough. PE acquirers pushing portfolio companies toward Claude Code while their lawyers are adding AI code reps to purchase agreements is exactly the gap that will produce the first painful deal. The rep usually survives unsigned because neither side has done the analysis. When the first deal falls apart or a rep is breached post-close because of GPL contamination in an AI-assisted codebase, that will set the market standard faster than any court ruling.

> When the first deal falls apart or a rep is breached post-close because of GPL contamination in an AI-assisted codebase, that will set the market standard faster than any court ruling.

Assuming it ever does... first, the GPL is hardly ever enforced, and second, I feel like there is going to be enough money at stake (e.g. Anthropic's own code it uses for the harness) pushing back against it being treated as problematic. We'll see.


Totally agreed.

I work in M&A. Nearly every lawyer, accountant, investor, and software business owner thinks their code is uniquely valuable and a trade secret. I find it hilarious and try to be as diplomatic as possible about explaining why it's not. They will also willingly hand their client list to a potential acquirer, but get super cagey the moment a third-party provider asks to scan their code.

This argument easily gets shut down when I ask why Twitch, a $1B business, didn't crater to its competition when its full codebase was leaked.


What's probably WAY worse than this is that most healthcare providers running OpenEMR are likely on older versions with already-known CVEs.

Nobody uses OpenEMR. No chance. They are lying about their numbers.

Well, it's maybe not popular at bigger hospitals, but back in the day I think it was relatively popular at smaller practices, even in the US. I don't know if it has lost traction (or not) with the popularization of cloud services; I'm not super up to date...

I can't speak to OpenEMR, but OpenMRS is popular overseas, and has done a lot of work in Africa.

OpenEMR may be in similar spaces.


The Nike Zoom Vaporflys already set this precedent years ago: https://www.nytimes.com/interactive/2018/07/18/upshot/nike-v...

The big improvement then was a carbon plate. Adidas (and others) followed suit. The subsequent improvements since then have been marginal but the margins are thin at that level. In this case the big advancement has been the weight of the shoe.

EDIT: Also, it's worth noting these shoes are $500 retail. Adidas will for sure get a boost in sales from this, but there's definitely competition in the $200-$300 marathon running shoe space, so not everyone will be drawn solely to Adidas.


Do these new Adidas shoes have anything major over the Vaporfly shoes? Maybe they are a bit lighter?

I think the big story here may be the nutrition science to get these guys to absorb a lot of carbs during the run, more than the shoes.

