How does that work? Based on IMEI, perhaps? Does spoofing that not do the trick?
I just checked. In the online stores of the 3 largest tech chains in my country, there's exactly one 16:9 40+" monitor model available, and that's a 43" VA panel. The rest of the 40+" stuff is weird, absurdly wide curved monitors and some smart-whiteboard type thing. So forgive me if I am extremely doubtful of your claim.
Lemme just pluck a 52" monitor from the 52" monitor tree where 52" monitors grow bountifully.
The problem has two sides: software and hardware. You can open source the software side all you want, it’s not gonna go very far when it has to fight against the hardware instead of working with it.
ROCm is open source, but it's AMD. Their hardware has historically not been as powerful, and therefore not as attractive to the target audience, so progress has been slow.
Yes. It could talk to another smart device and ask it to send its packets. You could be careful and connect none of the smart crap in your house to your network, but the smart fridge in your upstairs neighbor's kitchen could still be helping smuggle your data out. Or your devices could connect to some unsecured network nearby.
In any case, the only surefire way to stop your data from getting smuggled out is to physically kill all of the device's wireless connectivity: disconnect antennas, desolder chips, scrape out PCB traces. Otherwise you're just hoping the firmware is not doing anything funny. Fortunately I think these are all hypotheticals that have not (yet) been observed in real smart home products.
Its niche communities are nowhere near as strong as reddit's, though. The only reason I can't ditch reddit is small hobby subs and stuff like that. Their alternatives on lemmy are just not good enough, thanks to a hideous combination of too few users and fragmentation.
Wait, don't Bluetooth devices randomize their MACs like wifi devices do, to hide their identities from unpaired devices?
They found a very interesting way of selling their hybrid cars as full-on EVs where I live. Their e-power models are small ICEs working as generators for electric motors that then drive the wheels. Apparently the fact that the wheels get all their power from an electric motor makes it definitely not a hybrid, no sir, despite the cars having tiny-ass batteries and the ICE being the single source of power for the whole system. Also, they somehow have worse fuel efficiency than many contemporary ICEs that cost quite a bit less. I don't understand Nissan.
No screaming kids on the flight though. Probably.
Yeah, my Galaxy Watch 6 can go 2 days on a charge. I say "can" because it does depend on usage, but it's not a rare thing to happen.
Nobody's gonna abandon cars as a whole over this, the same way they wouldn't abandon bicycles as a whole over some other outrageously monetized luxury feature they could live without.
It’s cockus engORgio, not engorgioh
How is that? Does RISC-V have magical properties that make its designers infallible, or somehow make it possible to fix flaws in the physical design after the CPU has already been fabbed and sold?
That… is a very naive and inaccurate approach. You can't use frequency and core counts to guesstimate performance even when the chips in question are closely related, and they're utterly useless when it's two very different chips that don't even use the same instruction set. But anyway, there are benchmarks on that page, and they show the AMD chip is clearly not performing 9x the operations. It is obviously more powerful, just not nearly by that much.
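Just to put numbers on it (all of these specs are made up for illustration, not the real figures for either chip), here's the kind of estimate that multiplication gives you, and what it throws away:

```python
# All numbers here are made up for illustration -- not real chip specs.
chip_a = {"cores": 8,  "ghz": 3.5}   # stand-in for the apple silicon chip
chip_b = {"cores": 32, "ghz": 4.0}   # stand-in for the amd chip

naive = (chip_b["cores"] * chip_b["ghz"]) / (chip_a["cores"] * chip_a["ghz"])
print(f"naive cores*GHz ratio: {naive:.1f}x")  # ~4.6x with these fake numbers

# What the multiplication ignores: IPC (work done per clock), cache sizes,
# memory bandwidth, sustained vs boost clocks, and the fact that the two
# chips don't even run the same instruction set. Benchmarks capture all of
# that; spec-sheet arithmetic captures none of it.
```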
I desperately want something to start competing with apple silicon, believe me. But knowing just how good the apple silicon chips are from first-hand experience, forgive me if I am a little bit sceptical about a little writeup that only deals in benchmark results and official specs. I want to read about how it performs in real-life scenarios, because I also know from experience that benchmark results and official specs alone don't always give an accurate picture of real-life performance.
Am I blind? I don’t see any information in there to draw any conclusions about power efficiency. The little information that I do see actually seems to imply the apple silicon chip would be more efficient. Help me out please?
Never seen tearing look like a cracked mirror.
The license works. Just ask your grandparents who posted those disclaimers on Facebook 15 years ago.
Does the kernel even have that functionality built into it? I thought it only mapped the raw data from the keyboard into actual key presses, nothing more. That is to say, it's the kernel that determines the ctrl and z keys are being pressed, but it's something higher up the stack that decides what to do with that information. Could be wrong, though.
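For what it's worth, here's a minimal sketch of what I think the kernel actually hands over, assuming a Linux box with evdev. The device path is a placeholder you'd have to swap for your own keyboard's event node; the keycode constants are from linux/input-event-codes.h:

```python
# Minimal sketch: reading the kernel's raw key events via evdev on Linux.
# /dev/input/event3 is a placeholder; needs root or 'input' group access.
import struct

EVENT_FMT = "llHHi"                     # struct input_event on 64-bit Linux
EVENT_SIZE = struct.calcsize(EVENT_FMT)
EV_KEY, KEY_LEFTCTRL, KEY_Z = 0x01, 29, 44  # linux/input-event-codes.h

held = set()
with open("/dev/input/event3", "rb") as dev:
    while True:
        _sec, _usec, etype, code, value = struct.unpack(
            EVENT_FMT, dev.read(EVENT_SIZE))
        if etype != EV_KEY:
            continue
        if value:                       # 1 = press, 2 = autorepeat
            held.add(code)
        else:                           # 0 = release
            held.discard(code)
        if {KEY_LEFTCTRL, KEY_Z} <= held:
            # The kernel only reports "ctrl and z are down". Deciding that
            # this chord means "undo" happens entirely in userspace.
            print("ctrl+z pressed")
```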
They're probably talking about Samsung TVs, not their android phones/tablets. Installing Jellyfin on those things can be a chore. My experience with LG was similar: the official build was out of date and riddled with issues that didn't exist on other versions. It refused to play videos that worked well enough on other devices, transcode or no.