• 19 Posts
  • 74 Comments
Joined 3 months ago
Cake day: December 26th, 2024






  • llama@lemmy.dbzer0.com (OP) to Privacy@lemmy.ml · How to run LLaMA (and other LLMs) on Android. · edited 1 month ago

    This is all very nuanced and there isn’t a clear-cut answer. It really depends on what you’re running, for how long you’re running it, your device specs, etc. The LLMs I mentioned in the post did just fine and did not cause any overheating when not used for extended periods of time. You absolutely can run a SMALL LLM and not fry your processor if you don’t overdo it. Even then, I find it extremely unlikely that you’ll cause permanent damage to your hardware components.

    Of course that is something to be mindful of, but that’s not what the person in the original comment said. It does run, but you need to be aware of the limitations and potential consequences. That goes without saying, though.

    Just don’t overdo it. Or do, but the worst thing that will happen is your phone getting hella hot and shutting down.
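
    For a sense of what “small” looks like in practice, here’s a minimal sketch using the Ollama Python client. This assumes an Ollama server is already running locally (the post’s setup may differ), and the model tag is just an example:

    ```python
    # pip install ollama -- assumes a local Ollama server is already running
    import ollama

    # Small models keep the load (and the heat) manageable.
    # "llama3.2:1b" is just an example tag; use whatever small model you pulled.
    response = ollama.chat(
        model="llama3.2:1b",
        messages=[{"role": "user", "content": "Say hi in one sentence."}],
    )
    print(response["message"]["content"])
    ```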







  • I am not entirely sure, to be completely honest. In my experience it uses very little bandwidth, but that varies too: it really depends on how many people connect, for how long they connect, etc. If you have limited upload speeds, it might not be a great idea to run it in your browser/phone. Maybe try running it directly on your computer using the -capacity flag, as sketched below?
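
    For reference, here’s roughly what that looks like. This is a sketch, not official instructions: the standalone proxy is a Go program from Tor’s Snowflake repository, and the capacity value here is just an example:

    ```
    git clone https://gitlab.torproject.org/tpo/anti-censorship/pluggable-transports/snowflake.git
    cd snowflake/proxy
    go build
    # Cap concurrent clients so the proxy can't saturate your upload bandwidth
    ./proxy -capacity 10
    ```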

    I haven’t been able to find any specific numbers either, but I did find a post on the Tor Forum, dated April 2023, from a user complaining about high bandwidth usage. That is not the norm in my experience, though.




  • You are completely right. That was worded poorly and a few users have thankfully pointed that out. The answer, for most people, is yes. But that depends entirely on your threat model.

    The traffic to your Snowflake proxy isn’t necessarily from people in ‘adversarial countries’. A Snowflake proxy is a type of bridge, so just about anyone can use it. You can use a Snowflake bridge yourself if you want. However, it is strongly encouraged to leave bridges (including Snowflake) for the people who need them.

    So, for most people, it is generally safe to run a Snowflake proxy. Theoretically, your ISP can tell that connections are being made to your machine, but to them it will look like you’re on a video call on, say, Zoom, since Snowflake uses WebRTC technology. They can’t see the data, though, since everything is encrypted (check the Snowflake docs and Tor Browser’s documentation for further reference). You probably won’t get in any trouble for that.

    Historically, as far as we know, there haven’t been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting in trouble with law enforcement, but none of them have been arrested or prosecuted so far.

    If you know of any, let me know.


  • I did not use AI to write the post; I used Claude to refine it because English is not my first language. If there are any errors, that is my bad. Please point them out as you did so I can fix them.

    > This has several errors including the fact that running the proxy exposes your IP address.

    Thank you for pointing that out. That was worded pretty badly. I corrected it in the post.

    For further clarification:

    The person who is connecting to your Snowflake bridge connects to it in a p2p-like connection. So that person does know your IP address, and your ISP also knows the person’s IP address (the one connecting to your bridge).

    However, to both of your ISPs, it will look like the two of you are using some kind of video-conferencing software, such as Zoom, because Snowflake uses WebRTC technology. That makes your traffic inconspicuous and obfuscates from both ISPs what’s actually going on.

    To most people, that is not something of concern. But, ultimately, that comes down to your threat model. Historically, there haven’t been any cases of people getting in trouble with law enforcement for running bridges or entry and middle relays.

    So, will you get in any trouble for running a Snowflake bridge? The answer is quite probably no.

    For clarification, you’re not acting as an exit node if you’re running a Snowflake proxy. Please check Tor’s documentation and Snowflake’s documentation.

















  • > Though apparently I didn’t need step 6 as it started running after I downloaded it

    Hahaha. It really is a little redundant, now that you mention it. I’ll remove it from the post. Thank you!

    > Good fun. Got me interested in running local LLM for the first time.

    I’m very happy to hear my post motivated you to run an LLM locally for the first time! Did you manage to run any other models? How was your experience? Let us know!

    > What type of performance increase should I expect when I spin this up on my 3070 ti?

    That really depends on the model, to be completely honest. Make sure to check the model requirements. For llama3.2:2b, you can expect a significant performance increase, at the very least. A rough way to measure it is sketched below.
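
    A quick way to compare, sketched with the Ollama Python client (assuming Ollama is what you’re running; eval_count and eval_duration come back in the response metadata):

    ```python
    # pip install ollama -- rough tokens-per-second check against a local Ollama server
    import ollama

    response = ollama.chat(
        model="llama3.2",  # example tag; substitute the model you're testing
        messages=[{"role": "user", "content": "Explain WebRTC in two sentences."}],
    )

    # The Ollama API reports eval_duration in nanoseconds.
    tps = response["eval_count"] / response["eval_duration"] * 1e9
    print(f"{tps:.1f} tokens/s")
    ```

    Run it once on CPU and once with GPU offload enabled, and compare the two numbers.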