Submitted by rumovoice t3_11hscl1 in MachineLearning
Comments
kekinor t1_jayxe3j wrote
There is also plz, which seems more mature:
rumovoice OP t1_jaz2inw wrote
shell_gpt seems more mature than that
https://github.com/TheR1D/shell_gpt
Both of those are separate scripts rather than hotkey bindings that work inline in your shell.
kekinor t1_jb09s2y wrote
Might be that I misunderstood then. Thanks for pointing out the differences and shell_gpt.
[deleted] t1_jav5pu1 wrote
What dataset did you use to train the model? I'm creating something similar for an app and looking for a dataset.
Edit: NVM, you are using OpenAI API.
ginger_beer_m t1_jawngph wrote
OpenAI .. LOL
Quazar_omega t1_jax5sju wrote
I swear, sooner or later they'll change their name to some dystopian stuff like EthicalAI, since they aren't very open anymore but still want to keep a "good" face
ddproxy t1_jaxju3a wrote
Ministry of Truth
ddproxy t1_jaxkegs wrote
I regret this immediately.
Quazar_omega t1_jaxnwrk wrote
Too late
> I regret this immediately.
This statement is false.
- ChatGPT
Sirisian t1_jayfok9 wrote
They bought https://ai.com the other day, if you missed that. It redirects to ChatGPT for now.
Quazar_omega t1_jb034l2 wrote
Oh, wow, now THAT is self centered!
2blazen t1_jb1jk5h wrote
And lazy
Like, come on, at least have a landing page
Quazar_omega t1_jb1uf1u wrote
Lmao yeah, good point
chic_luke t1_jaxhi06 wrote
"Open"AI
visarga t1_jaw9d2e wrote
ALL the data
ank_itsharma t1_jax7x0s wrote
On a similar note, we can fine-tune the OpenAI API on a particular set of data, right?
[deleted] t1_jaxc6sf wrote
Yeah, I think so.
DaTaha t1_jawprrx wrote
What options does one have if one wants ChatGPT-like functionality but without actually reaching out to OpenAI or other such online services?
crayphor t1_jawrahh wrote
You can use a smaller model like GPT-2. You are not going to get ChatGPT performance without a terabyte of VRAM, but if you want to try something locally, GPT-2 exists.
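For anyone who wants to try the local route, here's a minimal sketch of running GPT-2 with the Hugging Face `transformers` library (assumed installed; the small GPT-2 weights download on first use, so this stays entirely offline afterwards):

```python
# Minimal local text generation with GPT-2 (no OpenAI API involved).
# Assumes the `transformers` package is installed; the prompt and
# generation settings here are just illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "List all files in the current directory:",
    max_new_tokens=20,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

Don't expect ChatGPT-quality shell commands out of base GPT-2; it's a raw language model with no instruction tuning, so this is mostly useful for experimenting with the local workflow.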
Art10001 t1_jayfgsc wrote
I suggest another model such as OPT or even Flan-T5, because they're much easier to set up than OpenAI's outdated instructions, which rely on old package versions and effectively demand a dedicated VM or Docker container.
Pat3418 t1_jbjmcty wrote
Have you done something like this? Any references you can share? Would love to have an offline tool like this…
Art10001 t1_jbjv13z wrote
I once set up the smallest model of Pygmalion to act as a chatbot. The UI was premade and Gradio-based.
It worked well. I had tried Flan-T5 first, but the UI did not recognize that model type (yet).
DAlmighty t1_javrbwq wrote
This is definitely impressive, but it also feels like more work than just banging out the commands.
rumovoice OP t1_javxdca wrote
If you know and remember the command - yes. If you'd need to google it or read the man page first, this could be faster. It can also be faster for complex commands with subshells and regexes.
DAlmighty t1_jaw5m9v wrote
Regexes are definitely the bane of my existence.
Trotskyist t1_jaxyg2b wrote
Been using regexes regularly for like 10 years now... still don't know how the hell to write them. Shameful, I know.
Shout out to some random guy named Olaf Neumann, without whom I'd be screwed: https://regex-generator.olafneumann.org/
DAlmighty t1_jaxzv4n wrote
That’s how I feel about https://regexr.com/. Without it, I’d be lost as well so don’t feel too bad.
hiptobecubic t1_jaw1yhv wrote
The ffmpeg example is worth it alone
n8mo t1_jawxx5t wrote
Not me googling the syntax for ffmpeg commands every time I have to use it LMFAO
maxToTheJ t1_jaw2nk0 wrote
Especially given the compute used on OpenAI's end. This isn't sustainable.
[deleted] t1_jaw4rqp wrote
[deleted]
omgpop t1_jav1omw wrote
Does it/could it send your directory/file tree as part of the prompt?
notaninvestor633 t1_javmt3c wrote
Wow this is awesome. Can’t wait to tinker around with it
BeautifulLurker t1_jax84v8 wrote
Could you DM me that .tgz? You see, I'm Satoshi and have been looking for that file for a while.
Art10001 t1_jayfsfe wrote
I've seen DM like 5 times in recent memory...
On Reddit we use PM, Private Message. Not DM, Direct Message, which is a Discordism.
Bentastico t1_jazub0z wrote
bro relax 😭
WarAndGeese t1_jax2je6 wrote
I imagine that stuff like this will be the future of interacting with computers, at least to a large extent, but it's frustrating how people sacrifice certainty for "the probability of it being right is good enough".
WarAndGeese t1_jax2k9d wrote
I think I would prefer that it ends up not being the case, but I can see the trajectory of how it would be.
RoninUTA t1_javs4zg wrote
Impressive!
eclipsejki t1_jb0tmeq wrote
This is dangerous on so many levels. External API calls have access to your entire computer. I'd wait for a smaller personal LLM.
possibilistic t1_javklac wrote
Is this a Super Mario RPG reference?
MuonManLaserJab t1_jaw7lmj wrote
You're thinking of that other library, Keraskeras Cola.
Ye1488 t1_jawh7km wrote
boomer
Supaguccimayne t1_jb0lees wrote
I'm right in the middle of being a millennial and I played Super Mario RPG
BezoomnyBrat t1_jaw7k23 wrote
Looks great, definitely going to try it out. Pity it only works with zsh and not bash, though.
rumovoice OP t1_jawfmk1 wrote
I'm not sure if bash has autocomplete capabilities like that (like asking for a query under the current command line)
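For what it's worth, bash does expose something similar through readline: `bind -x` can run a shell function that reads and rewrites the current command line via `READLINE_LINE` (bash 4+). A hypothetical `.bashrc` sketch, where the function name and the placeholder transformation are purely illustrative (a real port would call the OpenAI API instead):

```shell
# Hypothetical sketch: inline line editing in bash via readline.
# A real implementation would send READLINE_LINE to an LLM and
# replace it with the suggested command.
_llm_rewrite_line() {
    # Placeholder transformation so the mechanism is visible.
    READLINE_LINE="echo '${READLINE_LINE}'"
    READLINE_POINT=${#READLINE_LINE}
}
# Bind Alt+G to the rewrite function (interactive shells only).
bind -x '"\eg": _llm_rewrite_line'
```

What bash lacks is zsh's ZLE machinery for drawing an extra prompt line under the command being edited, so the "ask for a query below the current line" UX would need a workaround.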
ke7cfn t1_jazjfwv wrote
Can it download and build sshfs for an M1 Mac by query?
rumovoice OP t1_jazl88i wrote
> download and build sshfs for an M1 Mac
Its answer: git clone https://github.com/osxfuse/sshfs.git && cd sshfs && ./autogen.sh && ./configure && make && sudo make install
It doesn't do well in cases where it needs recent knowledge, like M1 issues.
ke7cfn t1_jazlzk6 wrote
Looks like here's what needs to be done:
https://www.reddit.com/r/macapps/comments/lea865/how_to_install_sshfs_on_big_sur/
Zieng t1_jax04t4 wrote
I'll try it out! But how is the API availability? Because availability on the chatbot, at least, is too low for the free tier :(
rumovoice OP t1_jauxt0h wrote
https://github.com/not-poma/lazyshell
A smart autocomplete script invoked with ALT+G. It can modify the existing command as well.
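For the curious, the general shape of such an inline zsh widget looks roughly like this. All names here are illustrative, not LazyShell's actual code; a real version would send the query and the current `BUFFER` to the OpenAI API:

```shell
# Hypothetical sketch of an inline LLM completion widget for zsh.
# ZLE widgets can read and rewrite the line being edited (BUFFER)
# and prompt for extra input below it.
_llm_complete() {
    local query
    # Ask for a natural-language query under the current command line.
    read "query?Query: "
    # A real implementation would call the API with $query and $BUFFER;
    # this placeholder just makes the mechanism visible.
    BUFFER="# completion for: ${query}"
    CURSOR=$#BUFFER
    zle reset-prompt
}
zle -N _llm_complete
bindkey '\eg' _llm_complete   # Alt+G
```

Because the widget edits `BUFFER` in place, it can both generate a fresh command and modify whatever you've already typed, which is what distinguishes this approach from standalone scripts like shell_gpt.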