• 3 Posts
  • 10 Comments
Joined 1 year ago
Cake day: August 28th, 2023

  • When you consider the price of a used Android (e.g. a OnePlus 6T for $80 on eBay) and compare it spec for spec with a Raspberry Pi, it's actually a really good deal. You get:

    • Built in backup power supply (battery)
    • 8-core power-efficient CPU (SDM845)
    • Embedded sensors (microphone, magnetometer, gyro)

    The way I set mine up was to run the server directly on Android inside Termux, have an app autostart Termux on boot, and make sure battery optimizations are disabled for that app (rough sketch below). Then I just kept the phone permanently plugged into the outlet to maintain the battery (and of course Android just trickle charges / stops charging once it's fully charged).

    Of course this isn't perfect, because you still have much more variability in play (at the OS level) than with an RPi (and you don't get a standard environment like Debian unless you use proot), but overall it's a very powerful setup that works quite well.
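
    As a rough illustration (not exactly what I ran), here's a minimal sketch of the kind of service you could start inside Termux; the port and paths are just assumptions, and on boot you'd launch it from a Termux:Boot start script:

    ```python
    # Minimal sketch: a tiny HTTP service to run inside Termux on the phone.
    # Assumptions: port 8080 is free, and you launch this from a script in
    # ~/.termux/boot/ (Termux:Boot) so it comes back up when the phone reboots.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    PORT = 8080  # hypothetical port

    if __name__ == "__main__":
        # Serves the current directory; swap in your own handler/app for a real server.
        HTTPServer(("0.0.0.0", PORT), SimpleHTTPRequestHandler).serve_forever()
    ```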



  • I respectfully disagree. Any high-quality creator is tangibly penalized by YouTube's recommendation algorithm for not optimizing their titles and thumbnails. A rare few choose to take this penalty, but I don't blame the many quality creators who choose to take part in the game YouTube has made for everyone.

    Yes, the alternate titles may not be perfect, but I'd take any random person's attempt at a title over the hyper-optimized ones any day. I'd rather make an informed decision to watch something, even with some degree of inaccuracy, than a completely uninformed decision based on what an algorithm predicted would most likely get me to click and get hooked on a video, regardless of my own will or whether I'm satisfied at the end of watching it.


  • Sure, to be pedantic, I could clarify: “I think the fediverse will realistically never gain mainstream adoption without a large organization with either a massive existing userbase or the ability to invest in large organized marketing efforts.”

    This could technically happen through some Fediverse collective that receives a large amount of donations, but I don't see that as very likely, and even with organized marketing efforts there's no guarantee of effectively converting them into adoption.


  • Unpopular opinion: Threads deepening ties to the fediverse is actually a really good thing for the fediverse as a whole.

    I feel like realistically the fediverse will never gain mainstream adoption on its own. People like to believe in this beautiful future where the fediverse “wins out” and beats all the major social media networks, but I just don’t see this happening. This is why I think Threads is actually really important for the growth of the fediverse and realistically one of the only paths to broad adoption.

    Beyond this, I also separately really like the idea of being able to use a platform like Threads with my IRL friends while still having access to open-source clients, etc. (i.e. preventing situations like the Twitter API debacle, which fucked over third-party clients).




  • I installed Haiku on this laptop from 1999 once, since it was actually the only non-Windows OS I could get to run for some reason. The video driver was bugged though, so the screen was visually offset by ~100 pixels, which made it too hard to use. Otherwise it ran at a bearable, normal speed, which is a huge feat for something like a 500 MHz processor and 500 MB or 1 GB of RAM (I forget the exact specs).


  • Sorry, but has anyone in this thread actually tried running local LLMs on CPU? You can easily run a 7B model at varying levels of quantization (e.g. 5-bit quantization) and get a generalized, promptable LLM. Yeah, of course it's going to take ~4GB of RAM (which is mem-mapped and paged in as needed), but you can easily fine-tune smaller, more specific models (like the translation one mentioned above) and get surprising intelligence at a fraction of the resources.

    Take, for example, phi-2, which performs as well as 13B-parameter models with only 2.7B params. Yeah, that's still going to take ~1.5GB of RAM, which Firefox wouldn't reasonably ship, but many lighter-weight specialized tasks could easily use something like a fine-tuned 0.3B model with quantization.
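
    For anyone curious, here's a minimal sketch of what running a quantized 7B model on CPU looks like in practice, using llama-cpp-python (one common option; the model filename and settings are just assumptions):

    ```python
    # Minimal sketch, assuming llama-cpp-python and a quantized GGUF model file.
    # Rough memory math: 7B params at ~5 bits/param is roughly 4-4.5 GB of weights,
    # which llama.cpp mem-maps rather than loading into RAM all at once.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./some-7b-instruct.Q5_K_M.gguf",  # hypothetical filename
        n_ctx=2048,   # context window
        n_threads=8,  # e.g. all 8 cores of an SDM845
    )

    out = llm("Translate to French: Where is the train station?", max_tokens=64)
    print(out["choices"][0]["text"])
    ```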



  • __matthew__@lemmy.world to linuxmemes@lemmy.world · Linus does not fuck around
    While Linus went overboard (as he has a history of doing, which has also brought negativity to the community), this post is still very well liked because it appears to be a strong example of someone calling out the BS that a lot of developers like to throw around. No one's going to join a circle celebrating Linus picking on some first-time contributor who didn't know any better, but that's how it sounds like you're interpreting the post.

    To add some context: there's a toxic superiority complex among many developers, where they jump to blame others for issues that actually stem from their own code. You can see this anywhere from developers who immediately blame users without investigating, to developers within companies who are quick to pass off issues as not their team's problem.

    So, in this example Linus is actually calling one of these developers out, which is why the post is very well-received.