
Run LLAMA-v2 chat locally

9 months ago
In this video, I'll show you how to run llama-v2 13B locally on an Ubuntu machine and also on an M1/M2 Mac. We will be using llama.cpp for this.
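
A minimal sketch of what running the model looks like from Python, assuming you use the llama-cpp-python bindings on top of llama.cpp (the video itself uses the llama.cpp command line) and that you have already downloaded a quantized 13B chat model; the model file name below is only a placeholder:

from llama_cpp import Llama

# Load a quantized LLaMA-v2 13B chat model (example path; point this at
# whatever quantized model file you downloaded for llama.cpp).
llm = Llama(model_path="./models/llama-2-13b-chat.Q4_K_M.gguf", n_ctx=2048)

# Ask a question through the chat-completion interface and print the reply.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain llama.cpp in one sentence."}]
)
print(response["choices"][0]["message"]["content"])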

Please subscribe and like the video to help keep me motivated to make awesome videos like this one. :)

My book, Approaching (Almost) Any Machine Learning Problem, is available for free here: https://bit.ly/approachingml

Follow me on:
Twitter: https://twitter.com/abhi1thakur
LinkedIn: https://www.linkedin.com/in/abhi1thakur/
Kaggle: https://kaggle.com/abhishek