Not even a summary of what’s on Wikipedia, usually a summary of the top 5 SEO crap webpages for any given query.
Depends. If they get access to the code OpenAI is using, they could absolutely try to leapfrog them. They could also just be looking at ways to get near GPT-4 performance locally, on an iPhone. They’d need a lot of tricks, but succeeding there would be a pretty big win for Apple.
I think I’ve tried this before but will give it another shot, maybe I just got the regular one.
I wonder too if there are genetic differences at play. Like folks who taste cilantro differently.
Anyways if it’s 90% as good as milk then that’ll be good enough for me to switch haha, thanks!
Please give me recommendations of oat milk that tastes good. I’ve been desperately looking and/or hoping for bacterial production to kick off to make it more environmentally sustainable, but I haven’t found anything that tastes remotely as good (on its own or in a latte). I drink ultrafiltered milk for what it’s worth, usually 2% so I don’t need the creamy aspect, I just like the flavor.
Yeah, I think in general people come up with veiled reasons for lower taxes but fundamentally it’s just because they don’t want to pay taxes at all.
I like taxes. I like having roads (though I want more public transportation), I like having firefighters and public parks and protected green spaces and…
Well, they’d argue that the money they put in was already taxed, presumably because it was income and subject to income tax. So any income that has already been assessed for tax, they’d argue, counts as taxed.
I’d just argue any income (including from capital gains) should be taxed according to your wealth. I don’t care if it has already been assessed for tax. If it’s income, and you already have excessive wealth, you should be paying a hefty tax. The point of taxes is redistribution of wealth and communal improvement (e.g. infrastructure), so I really don’t care if something is taxed once, twice, or more times. I care that wealth is taxed and used for the public good.
So the real/original answer to this was the idea that we should avoid double taxation. If you were taxed on income already, and then invested that income, which is now post-tax, the capital gains should then be taxed less (or, some argue, not at all) because you already paid taxes on it.
I’m of the opinion that taxes should be based on any income you make, scaled by the wealth you have. Source of income for the wealthiest should be irrelevant (and yes, in my mind this includes realizing gains from stocks by borrowing against them).
Almost. If you own a share of a company, you own a share of something tangible, namely literal company property or IP. Even if the company went bankrupt, you’d own a sliver of their real assets (real estate, computers, patented processes). So while you may be speculating on the wealth associated with the company, it is not a scam in the sense that it is backed by something real. The sole value of cryptocurrency is its speculative value; it is not tied, in theory or in practice, to anything of perceptibly equal realized value. A dividend just gives you a return on profit made from those realized assets (the aforementioned real estate or other company property or processes), but the stock itself is intrinsically tied to literal ownership of those profit-generating assets.
Except, you know, the stock being tied to ownership in a company that sells real goods or services. Definitely problems with how stocks are traded, but they’re quite different from crypto.
This so much. People need to recondition themselves to understand that entertainment is not news, and news should not be entertainment. News should be a description of noteworthy (even if boring) factual events. A presumptive presidential candidate talking about violating the constitution of the United States, that is noteworthy; maybe not unexpected in this case, but noteworthy.
We need to be less entertained by news and more informed by news. Tell me what presidential candidates are saying in the most mundane terms possible. Anything beyond the barest oblique (as opposed to direct) fact or factual description should be eschewed.
I mean you can model a neuronal activation numerically, and in that sense human brains are remarkably similar to hyper dimensional spatial computing devices. They’re arguably higher dimensional since they don’t just integrate over strength of input but physical space and time as well.
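To make the "model a neuronal activation numerically" point concrete, here's a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest numerical neuron models. All constants (threshold, leak, weight) are illustrative, not biologically calibrated:

```python
# Minimal leaky integrate-and-fire neuron: membrane potential integrates
# weighted inputs over discrete time steps, decays ("leaks") each step,
# and fires a spike when it crosses a threshold, then resets.

def simulate_lif(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Return the time steps at which the neuron spikes."""
    v = 0.0          # membrane potential
    spikes = []
    for t, x in enumerate(inputs):
        v = v * leak + weight * x   # decay a little, then integrate input
        if v >= threshold:
            spikes.append(t)        # fire
            v = 0.0                 # reset after spiking
    return spikes

# Steady input drives the neuron over threshold at regular intervals.
print(simulate_lif([1] * 12))
```

Even this toy captures the "integrate over time" aspect: the same total input produces different outputs depending on when it arrives, which is part of what makes real neurons higher-dimensional than plain weighted sums.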
I mean if he broke the law and there is enough evidence to get a conviction amongst a jury of his peers then, like, yeah, go for it. I don’t want any president or any citizen to be able to claim immunity just because they held political office for some period of time. Like if you can’t lead the country legally then don’t lead it? Don’t do the crime if you can’t do the time or some platitude.
I think in general the goal is not to stuff more information into fewer qubits, but to stabilize more qubits so you can hold more information. The problem is in the physics of stabilizing that many qubits for long enough to run a meaningful calculation.
What games in particular do you play that are abysmal on Linux?
Guy Person is a racist troll.
Databricks is in the top 35% of similar companies in terms of diversity. So if they were trying to say this was an achievement made without people of color or diversity, they just self-owned, since Databricks is in fact considered diverse for its field.
Edit: I shouldn’t use gender assumption language even though guy is fairly gender neutral where I am (you guys).
Double edit for a source: https://www.comparably.com/companies/databricks/diversity
It is tiny, but far more parity in terms of arms, plus the whole being-an-island thing, makes it exponentially harder to invade than, say, a country you share a land border with, complete with roads leading you to where you want to go.
It may be no different than using Google as the search engine on Safari, assuming I get an opt-out. If it’s used for Siri interactions, then it gets extremely tricky to verify that your interactions aren’t being used to inform ads and/or train an LLM. Much harder to opt out of than a default search engine there, perhaps.
LLMs do not need terabytes of RAM. Heck, you can run quantized 7-billion-parameter models on 16 GB or less (BLOOM, Falcon-7B — Falcon outperforms models with higher memory requirements, by the way, so there’s room here for optimization). While not quite as good as OpenAI’s offerings, they’re still quite good. There are Android phones with 24 GB of RAM, so it’s quite possible for Apple to release an iPhone Pro with that much and run it much like you’d run any large language model on an M1 or M2 Mac. Hell, you could probably fit an inference-only model in less. Performance wouldn’t be blazing, but depending on the task it could absolutely be sufficient. With Apple MLX and Ferret coming online, it’s totally possible that you could, basically today, have a reasonable LLM running on an iPhone 15 Pro. People run OpenHermes 7B, for example, which uses ~4.4 GB, without those frameworks. Battery life does take a major hit, but to be honest I’m at a loss for what I need an LLM for on my phone anyways.
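The memory math above is easy to sanity-check. A rough sketch (the 1.2x overhead factor is my own illustrative guess for runtime overhead; real usage also adds KV cache and activations):

```python
# Back-of-envelope RAM estimate for running a quantized LLM for inference.
# Weights dominate: params * bits-per-weight / 8 bytes, plus some overhead.

def model_ram_gb(params_billion, bits_per_weight, overhead=1.2):
    """Approximate resident memory in GB (overhead factor is a rough guess)."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization lands around 4 GB, in line with the
# ~4.4 GB figure for OpenHermes 7B, and comfortably inside 16 GB of RAM.
print(round(model_ram_gb(7, 4), 1))
```

By the same arithmetic, a 40B model at 4 bits needs roughly 24 GB, which is why a doubled-RAM iPhone plus the SSD-streaming tricks starts to look plausible.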
Regardless, I want a local LLM or none at all.
This is a really bad look. It will probably be the case that it will be an opt-in feature, and maybe Apple negotiates that Google gives them a model they house on premises and don’t send any data back from, but it’s getting very hard for Apple to claim privacy and protection here (not that they do a particularly good job of that unless you block all their telemetry).
If an LLM is gonna be on a phone, it needs to be local. Local is really hard because the models are huge (even with quantization and other tricks). So this seems incredibly unlikely. Then it’s just “who do you trust to sell your data for ads more, Apple or Google?” To which I say neither, and pray Linux phones take off (yes, yes, I know: root an Android and de-Google it, but still).
This should actually work against them. It would be more like “See, we’re not interested in competing, we’d rather maintain monopolies and cartel it up!”
They don’t, but with quantization and distillation, as well as fancy use of fast SSD storage (they published a paper on this exact topic last year), you can get a really decent model working on-device. People are already doing this with things like OpenHermes and Mistral (granted, 7B models, but I could easily see Apple doubling RAM, optimizing models using the research paper I mentioned above, and getting 40B models running entirely locally). If the first stage of the network is good, a 40B model could take care of the vast majority of Siri queries without ever reaching out to the server.
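For anyone unfamiliar with what quantization actually does, here's a toy sketch of the basic idea: store each weight as an int8 plus one shared float scale, cutting storage 4x versus float32. Real schemes (group-wise scales, 4-bit packing, etc.) are fancier, but this is the shape of it:

```python
# Toy symmetric int8 quantization: map floats into [-127, 127] with one
# shared scale factor, so each weight costs 1 byte instead of 4.

def quantize(weights):
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

w = [0.12, -0.5, 0.03, 0.25]
q, s = quantize(w)
restored = dequantize(q, s)
# Rounding error is bounded by half the scale step per weight.
print(max(abs(a - b) for a, b in zip(w, restored)))
```

The error per weight is at most half a quantization step, which is why well-quantized models lose surprisingly little quality relative to the memory saved.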
For what it’s worth, according to their WWDC notes, they’re basically trying to do this.