• 0 Posts
  • 36 Comments
Joined 8 months ago
Cake day: January 25th, 2024

  • One thing I can think of is an overzealous corporate security solution blocking or holding back your email purely for having an attachment, or because it misunderstands/presumes the cipher-looking text file to be an attempt to bypass filtering.

    Other than that, you might get curious questions from recipients of a key/file they don’t understand and aren’t expecting. (“What’s this for? Is this part of the contract documents? Oh well, I’ll forward it to the client anyway.”)

    Other than that, it’s a public key, so go for it. It’s hard (for me anyway) to decide to post them to public keyservers when the bot-nets read them for spam, so this might be the next best thing?


  • I mean, it kinda makes sense. Especially in this day and age, the appeal is the final say, not the court ruling (feels like everything gets appealed). So this way, the place that happens is the highest court in the state. The final ruling is about whether the highest non-appeals court did it right, not the original issue.

    Or, put another way: if you tell me the highest court in the land has made a decision, I would expect that to be the end of it. But it’s not. From the moment the verdict is read, lawyers are preparing an appeal. Therefore, whatever court takes the appeal makes the true final decision. Why not then make that the highest court in the land and better reflect the role?



  • Now would be a good time to look for a .com you like, or one of the more common TLDs, and register it at Namecheap, Porkbun, or Cloudflare. (Cloudflare is cheapest, but all-eggs-in-one-basket is a concern for some.)

    Sadly, all the cheap or fun TLDs have a habit of being blocked wholesale, either because the cheap ones are overused by bad actors or because corporate IT just blacklists “abnormal” TLDs (or only whitelists the old ones?) because it’s “easy security”.

    Notably, XYZ also does that 1.111B initiative, selling numbered domains for 99¢, which further feeds affordability for bad actors and justifies a flat-out sinkhole of the entire TLD.

    I got a three character XYZ to use as a personal link shortener. Half the people I used it with said it was blocked at school or work. My longer COM poses no issue.


  • Some would think this is horrible, but to me, it would be wholly dependent on the title/what was bought and sold.

    Nothing in this world is free. Development, servers, character licensing: it all costs money, and if those costs aren’t passed down, you’ll never be able to afford to continue. So for a game, especially one with online or continuing content, to be free to play, the money has to come from somewhere.

    Where the road splits is what is being sold. Things that give an edge in the game, pay-to-win? Uninstalled. Time-limited FOMO triggers? Disgusting. Random loot boxes? Begone, foul spirit.

    On the other end, if all that is for sale is shiny baubles and trinkets, things no one needs but can have as a reward for “supporting development”? I’m cool with that. If I feel no requirement to pay up, it’s being handled right, and if I like the game, sure, I can part with a fiver to look like I’m dipped in gold or whatever the supporter pack adds, to help them keep the lights on (at least until I get bored of it in a week or two and switch back :P).

    I’d be curious what the divide is between the two kinds of purchases. I’m sure I’ll be disappointed to find it was mostly P2W scum, though.


  • Is there a list anywhere of this and other settings and features that could/should certainly be changed to better Firefox privacy?

    Other than that, I’m not sure I’m really going to jump ship. I think I’m getting too old for the “clunkiness” that comes with trying to use third-party/self-hosted alternatives to replace features that ultimately break the privacy angle, or to add them to barebones privacy-focused browsers. Containers and profile/bookmark syncing, for example. But if there’s a list of switches I can flip to turn off the most egregious things, that would be good for today.


  • PassingThrough@lemmy.world to Memes@lemmy.ml · Get rich quick
    3 months ago

    Forgive me, I’m no AI expert and can’t fully relate the needed tokens-per-second measurement to the average query Siri might handle, but I will say this:

    Even in your article, only the largest model ran at 8 tokens/sec; the others ran much faster, and none of them were optimized for a task, just benchmarked.

    Would it be impossible for Apple to be running an optimized model specific to expected mobile tasks, and leverage their own hardware more efficiently than we can, to meet their needs?

    I imagine they cut out most worldly knowledge and use a lightweight model, which is why there is still a need to link out to ChatGPT or Apple for some requests. Would this let them trim Siri down to perform well enough on phones for most requests? They also advertised launching AI on M1 and M2 chip devices, which are not M3 Max either…


  • PassingThrough@lemmy.world to Memes@lemmy.ml · Get rich quick
    3 months ago

    Onboard AI chips will allow this to be local.

    Phones do not have the power to ~~~

    Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer? Gotta have those latest and greatest chips.

    It will be fun to see how it all shakes out. If the AI can’t run most queries on the phone with all this advertising of local processing…there’ll be one hell of a lawsuit coming up.

    EDIT: Finished looking for what I thought I remembered…

    Additionally, Siri has been locally processed since iOS 15.

    https://www.macrumors.com/how-to/use-on-device-siri-iphone-ipad/


  • PassingThrough@lemmy.world to Memes@lemmy.ml · Get rich quick
    3 months ago

    I think there’s a larger picture at play here that is being missed.

    Getting the weather has been a standard feature for years now. Nothing AI about it.

    What is “AI” is asking: “Hey Siri, what is the weather at my daughter’s recital coming up?”

    The AI processing, calculated on-device if what they claim is true, is:

    1. the determination of who your daughter is
    2. What is a recital? An event? Are there any upcoming calendar events that match this concept?
    3. Is the “daughter” associated with this event by description or invitation? Yes? OK, what’s the address?
    4. Submit zip code of recital calendar event involving the kid to the weather API, and churn out a reply that includes all this information…

    Well {Your phone contact name}, it looks like it will {remote weather response} during your {calendar event from phone} with {daughter from contacts} on {event date}.

    That is the split between on-device and cloud processing. The phone already has your contacts and calendar, so it does that work offline rather than educating an online server about your family, events, and location, and it requests the bare minimum from the internet: in this case, nothing more than if you had opened the weather app yourself and put in a zip code.
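    To make that split concrete, here’s a minimal sketch in Python of how such a pipeline could look. Everything in it (Contact, CalendarEvent, fetch_forecast, the sample data) is a made-up stand-in, not Apple’s actual API; the point is only that nothing but a zip code and a date ever leaves the device.

    ```python
    # Hypothetical sketch only: every name here is invented for illustration,
    # not Apple's real API. It mirrors the numbered steps above.
    from dataclasses import dataclass
    from datetime import date


    @dataclass
    class Contact:
        name: str
        relationship: str  # e.g. "daughter"


    @dataclass
    class CalendarEvent:
        title: str
        day: date
        zip_code: str
        invitees: list[str]


    def fetch_forecast(zip_code: str, day: date) -> str:
        """The only cloud call: a weather lookup that sees just a zip code
        and a date, nothing about who is asking or why. Stubbed out here."""
        return "be sunny and 72°F"


    def recital_weather(owner: str, contacts: list[Contact],
                        calendar: list[CalendarEvent]) -> str:
        # 1. Work out who "my daughter" is, entirely from on-device contacts.
        daughter = next(c for c in contacts if c.relationship == "daughter")

        # 2./3. Find a calendar event that looks like a recital and involves
        #       her, again without leaving the device. (A real assistant would
        #       also filter for upcoming dates and handle misses.)
        event = next(e for e in calendar
                     if "recital" in e.title.lower()
                     and daughter.name in e.invitees)

        # 4. Only now touch the network, sending nothing but zip code and date.
        forecast = fetch_forecast(event.zip_code, event.day)

        return (f"Well {owner}, it looks like it will {forecast} during your "
                f"{event.title} with {daughter.name} on {event.day:%B %d}.")


    if __name__ == "__main__":
        contacts = [Contact("Emma", "daughter")]
        calendar = [CalendarEvent("Piano recital", date(2025, 6, 14),
                                  "98052", ["Emma"])]
        print(recital_weather("Alex", contacts, calendar))
    ```

    In this sketch, steps 1 through 3 never need the network; step 4 sends only what the weather app itself would have sent.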


  • Plug it into a monitor or TV and keep an eye on the console.

    I have an older NUC that will not cooperate with certain brands of NVMe drive under PVE…the issue sounds like yours, where it would work for an arbitrary amount of time before crashing the file system, attempting to remount read-only, and rendering the system inert and unable to handle changes like plugging a monitor in later, yet it would still be “on”.