QQ: does it support programmatic cell access/modifications?
Eg `cell file.cell --write A2 "42"` or `cell file.cell --read "=SUM(A1:A10)"`? Couldn't tell from my quick glance at the README, but it would be pretty cool for scripting.
Apart from Anthropic, nobody knows how much the average user costs them. However, the consensus is "much more than that".
If they have to raise prices to stop hemorrhaging money, would you be willing to pay 1000 bucks a month for a max plan? Or $100 per 1M output tokens (playing Numberwang here, but the point stands)?
If I had to guess, they are trying to get their balance sheet in order for an IPO, and they basically have 3 ways of achieving that:
1. Raising prices like you said, but the user drop could be catastrophic for the IPO itself and so they won't do that
2. Dumb the models down (basically decreasing their cost per token)
3. Send fewer tokens (ie capping thinking budgets aggressively).
2 and 3 are palatable because, even if they annoy the technical crowd, investors still see a big number of active users with a positive margin on each.
$1000/mo for guaranteed functionality >= Opus 4.6 at its peak? Yes, I'd probably grumble a bit and then whip out the credit card.
I'm not a heavy LLM user, and I've never come anywhere near the limits of the $200/month plan I'm already subscribed to. But when I do use it, I want the smartest, most relentless model available, operating at the highest performance level possible.
Charge what it takes to deliver that, and I'll probably pay it. But you can damned well run your A/B tests on somebody else.
You will, but many, many others probably won't. In some parts of the world, $200 is already a big chunk of monthly income, and a price hike will definitely push those users away, which is bad for the upcoming (potential) IPO.
Yeah, you are using the wrong tool if you send your newsletter from a gmail account at that scale. You can get away with a few tens of recipients, perhaps a few hundred.
Above that threshold you should use tools like moosend, benchmarkemail, or similar. And they charge a pretty penny when you reach that scale.
I believe they either can no longer afford to subsidize inference with VC money, or they are trying to get their balance sheet in order for an IPO.
So they could be trying to tighten the thinking budget (to decrease tokens per request) or to lobotomize the model (to get cheaper tokens). I mean, no one really knows how much a $200/month plan actually costs Anthropic, but the consensus is "more than that", and that might be coming to an end.
This explanation lines up well with the recent outrage about out-of-quota errors that people were reporting on the cheaper (or free) plans.
I like this change. I was wondering whether I would've preferred something on the function signature (eg `tcc_fn foo() ...`, as in Tail Call Constrained fn), where the Rust compiler, on encountering it, would check whether the body of the function is tail-callable.
My fear is that yet another keyword might get lost in the sea of keywords a Rust developer needs to remember. And if recursion is not something you do often, you might not reach for it when it's actually needed.
Having this signal in the function signature means people would be exposed to it just by reading the documentation, and eventually they'd learn it exists and (hopefully) how to wield it.
The property we care about isn't a property of functions but of callers, so marking a function doesn't help.
`become blah(foo, bar)` is the same as `blah(foo, bar)`, except that we, the caller, are promising we have nothing further to do, so when blah returns it can return directly to our caller.
If somebody else calls blah they might not want that behaviour: they might have lots more to do, and if they were skipped over, that's a disaster.
In some languages it's very obvious when you're going to get TCO anyway, but Rust has what C++ calls RAII: when a function ends, all the local variables get destroyed, and this may be non-trivial work. Presumably destroying a local i32 is trivial, and so is a [u8; 32], but destroying a local String isn't, let alone a HashMap, and who knows how complicated it is to destroy a File or a DBQuery or a Foo...
So in a sense, "all" become does is pull that destruction a little earlier, so it happens before the call, leaving nothing to do afterwards. We weren't using that String any more anyway, let's just destroy it first, and the HashMap? Sure. And oh... no, actually if we destroy that Foo before calling blah, which needs the Foo, that messes things up. Rust's borrowck comes in clutch here to help us avoid a terrible mistake: our code was nonsense, it doesn't build.
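A minimal sketch of that "destroy locals before the call" idea, in stable Rust. The real `become` keyword is still an unstable nightly feature, so this hypothetical example (the function and names are mine, not from the thread) emulates by hand what the compiler would enforce:

```rust
// Hypothetical example: `become` is unstable, so we manually do what it
// would require: every local is destroyed *before* the tail call.
fn countdown(n: u64, log: String) -> usize {
    if n == 0 {
        return log.len();
    }
    let next = format!("{log}->{n}");
    drop(log); // destroy `log` now; with `become`, this drop would happen here
    // With the unstable feature, this line would read `become countdown(n - 1, next)`.
    countdown(n - 1, next) // plain call: the stack still grows without real TCO
}
```

If `next` had borrowed from `log` instead of owning its own data, borrowck would reject the early `drop`, which is exactly the "our code was nonsense, it doesn't build" safety net described above.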
Given that it's not really that uncommon to see something like `pub(crate) async fn foo() ...`, the concern of function signatures starting to get unwieldy feels a lot more concrete than hypotheticals about a "sea of keywords".
From looking at the list of keywords in the language currently (listed here: https://doc.rust-lang.org/std/#keywords), I don't really see a whole lot that I think the average Rust programmer would say is a burden to have to remember. At most, `union` and `unsafe` are probably ones that most Rust programmers are not going to need to use directly, and `SelfTy` might look a bit confusing at first due to the fact that the keyword itself is spelled `Self` (and presumably listed in that way to make it easier to differentiate from the `self` entry in the documentation), but even including those I'd argue that probably over half aren't specific to Rust.
As for being in the documentation, I'd argue that might even be an explicit non-goal; it's not clear to me why that should be something considered part of the API. If someone updates their tail recursive function to manually loop instead (or vice-versa), it doesn't seem like that should necessitate the documentation being updated.
I'd actually say that for people learning Rust after something like C or C++ in particular, the rare cases where a keyword means something else are the most confusing. In particular, `const` in Rust means a true constant, whereas in several languages it means an immutable variable.
const NINE: i32 = 3 * 3; // any *constant* expression
In K&R C this qualifier didn't exist, so there was no confusion, but C89, all versions of C++, and several other languages inspired by them use "const" to mean an immutable variable.
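A short sketch of the distinction (names invented for illustration): Rust's `const` demands a compile-time-evaluable expression and is conceptually inlined at each use site, while the closest analogue of C's `const` is an ordinary immutable binding.

```rust
// Rust `const`: must be a compile-time constant expression; no address
// of its own is guaranteed, it's substituted wherever it's used.
const ANSWER: i32 = 6 * 7; // evaluated at compile time

fn with_runtime_value() -> i32 {
    let n = 40;    // a runtime value
    let x = n + 2; // immutable *binding*: closer to what C's `const` gives you
    // const BAD: i32 = n; // would not compile: `n` isn't a constant expression
    ANSWER + x
}
```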
That's a fair point, and maybe even a case that there should be more keywords rather than fewer.
Relatedly, I still sometimes get tripped up by the nuances of using `const` versus `static` for top-level constants. Most of the time the difference is entirely opaque to the programmer (because it's not obvious when most things are getting inlined or being referenced from a single place in memory), but it's possible to run into cases where one works and the other won't (e.g. trying to be clever with `OnceCell` rather than `OnceLock`).
It might help to think about whether you want an actual, singular, concrete thing, which means you need a static, or whether you just want to talk about the idea, in which case it doesn't matter whether at runtime it exists in many places or nowhere at all, which is a const.
Statics can be mutated (though not safely) because they are a single concrete thing that can be changed, whereas it can't mean anything to mutate a constant, hence the word "constant".
For larger objects you might want a single concrete thing even when it doesn't intuitively seem important, because it impacts performance. For example, suppose we keep evaluating FACTOR[n], where FACTOR is an array of a million numbers (maybe computed by scientist colleagues for your application) and n is a variable. If FACTOR is a const, Rust will just put a copy of that enormous array everywhere it needs to do the indexing, which gets out of hand really fast. If we use a static instead, we get a single concrete thing named FACTOR, and everywhere in the program will use that one million-number array: much tidier, and less likely to result in, say, running out of RAM on a small computer.
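A tiny sketch of the two declarations (a four-element array standing in for the million-entry FACTOR; values are made up):

```rust
// `static`: one concrete array in memory, shared by every use site.
static FACTOR: [f64; 4] = [1.0, 2.5, 0.5, 4.0];

// `const`: just the idea of a value; it may be copied/inlined wherever used.
const SCALE: f64 = 2.0;

fn scaled_factor(n: usize) -> f64 {
    FACTOR[n] * SCALE
}
```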
Yeah, the problem is that I usually think about stuff like this up front and then select something, and when much later I happen to change something, I focus so much on the types and the values that I forget that I need to also look at the keyword itself.
For what it's worth, my rule of thumb is usually to start with `static` and then only swap to `const` if I have a reason to. If I recall correctly, the issue I alluded to above was around picking between `LazyCell` and `LazyLock` and swapping between `&Path` and `PathBuf`, and some combination of them only working with `const` and not `static`.
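For what it's worth, a hedged sketch of that `static` + lazy-initialization pattern (path and names invented for illustration; `LazyLock` has been in std since Rust 1.80):

```rust
use std::path::PathBuf;
use std::sync::LazyLock;

// One shared, lazily-initialized value: the closure runs on first access,
// and every use site then sees the same PathBuf. As a `const`, each use
// site would conceptually get its own fresh copy, defeating the purpose.
static CONFIG_DIR: LazyLock<PathBuf> =
    LazyLock::new(|| PathBuf::from("/etc/myapp"));

fn config_file(name: &str) -> PathBuf {
    CONFIG_DIR.join(name)
}
```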
Isn't salary a proxy for how hard a person (or group of people) is to replace, or how valuable they are?
There was a surge in demand for SWEs, and scarcity brought salaries up. Are they too high? Hell no. On average, my colleagues and I generated ~$2M each in 2025 for our company, while we get paid a fraction of that (grants and bonuses included). If you look at net income per employee, we are at around $700k each for 2025.
Additionally, employers try their hardest to drive costs down (eg. offshoring as much as possible, everyone doing layoffs at the same time, ...), yet average/median salaries remained high. If salaries were overinflated, those numbers should have come down, I believe. The fact that they didn't makes me think it's still a scarcity problem, not an overinflation one.
>There was a surge in demand for SWEs, and scarcity brought salaries up. Are they too high? Hell no. On average, my colleagues and I generated ~$2M each in 2025 for our company, while we get paid a fraction of that (grants and bonuses included). If you look at net income per employee, we are at around $700k each for 2025.
So by that logic, housing in coastal cities also isn't "overinflated"? After all, like SWEs, it's scarce and in demand. It also provides enormous value to the people buying/living in it; otherwise they'd be living in Oklahoma or whatever and paying a fraction of the cost.
Maybe we give different meanings to the word "overinflation". I see it as something speculative/shady in nature. Is housing overinflated? Probably in some places, for sure, because those who already own a house or invested in real estate want to cut down supply to raise prices.
Is the same true of the job market? I don't think so. I've never heard any SWE say "let's scare people away from a CS career so we can bargain for higher salaries". The opposite is true, though: companies participate in career fairs, pre-uni events, and so on to make people gravitate towards CS careers, and with a higher supply each employee loses a bit of bargaining power.
Small excursus: this very fact was taken to the extreme in 2022, when everyone did layoffs at the same time despite the numbers still being great. If you put 300k people on the street at around the same time, you can hire some of them for way less money, as they've lost all leverage (since there are another 299,999 people waiting in line for a job).
Ya, that sounds right to me. Coastal city housing is very supply constrained, part of why it's so expensive, but it is hugely in demand and provides tons of value to many by letting them live near high paying companies. Unless by "overinflated" you mean a constrained supply/demand curve?
With my setup (Ghostty, tmux, nvim) I don't have any problems, honestly. When working with UI stuff I use Rectangle to get a bit of the tiling behavior I was used to on i3, but nowadays I need that less and less because browsers are adding split view themselves.
Battery life is great and everything feels snappy even after the machine has been powered on for weeks.