Tweets From Brian Roemmele

“‘Rather mysterious’ 1,050-year-old structure unearthed in Germany. What is it?” (View Tweet)

ProductGPT. I built a local, private LLM for a company, a long-term client: a 13-million-word corpus on one product, covering everything we know, from internal financial to technical information. We can now hold full dialogues with the product and ask for ways to improve it. It is…


BOOM! A new system to not only run but train a 20-billion-parameter finetune of your data with a 4-bit quantized model, ultimately on an iPhone, Android phone, or laptop. Meet QLoRA: Efficient Finetuning of Quantized LLMs [1]. NO MOAT. Testing this now. _ [1]…
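QLoRA's headline trick is storing the frozen base weights in 4 bits (the paper's NF4 data type) while training only small adapters. As an illustration only, here is a minimal sketch of plain symmetric linear 4-bit quantization, not the actual NF4 scheme, just to show why 4-bit storage still preserves weights approximately:

```python
# Illustrative 4-bit linear quantization (NOT the NF4 scheme QLoRA uses).
# Codes fit in 4 bits (-8..7) plus one per-tensor float scale.

def quantize_4bit(weights):
    """Map floats to 4-bit integer codes with a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 7.0 or 1e-12  # 7 = largest positive code
    codes = [max(-8, min(7, round(w / scale))) for w in weights]
    return codes, scale

def dequantize_4bit(codes, scale):
    """Reconstruct approximate float weights from codes and scale."""
    return [c * scale for c in codes]

weights = [0.31, -0.72, 0.05, 0.9, -0.44]
codes, scale = quantize_4bit(weights)
approx = dequantize_4bit(codes, scale)

# Every code fits in 4 bits; rounding error is bounded by scale / 2.
assert all(-8 <= c <= 7 for c in codes)
assert all(abs(a - w) <= scale / 2 + 1e-9 for a, w in zip(approx, weights))
```

The real NF4 format uses quantiles of a normal distribution rather than evenly spaced levels, plus blockwise scales and a second quantization of the scales themselves; the memory arithmetic, though, is the same, roughly a 4x reduction versus 16-bit weights.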



“Houseplants can protect you from cancer-causing air pollutants” (View Tweet)

I can say this with absolute certainty: if one is building a #VoiceFirst device, for in-ear use especially, it is of paramount importance to study the Muzak human factors research going back to 1922. No one is conducting studies like this today. Know it, or one will reinvent the wheel. (View Tweet)

“The counsel of advisors motif SuperPrompt” I have seen this SuperPrompt change people’s minds and change lives by showing new ways to look at any problem or situation. The prompt elicits this via a group of counselors that you choose. Mastermind this. (View Tweet)

So let me break down what this means in a practical way. Today you will be able to ask GPT4All to do something using the LocalDocs plugin and the model’s training; it will propose a solution, and in many cases write an “app” to do it. If you choose yes, the platform will run… (View Tweet)

How good is the open-source Orca LLM, a ChatGPT-like local 13B model? It is surpassing ChatGPT-3.5 Turbo, on consumer computers with NO INTERNET and about 8 GB of hard drive space. These are the benchmarks:


“Home foreclosures are rising nationwide, with Florida, California and Texas in the lead” (View Tweet)

The diorite Palermo Stone is one of seven surviving fragments of a stele known as the Royal Annals of the Old Kingdom of Ancient Egypt. It contains a list of the kings from 3150–2283 BC and notes significant events in each year. It was likely made around 2290 BC. The inscriptions are flawless.


Building a Truly "Open" OpenAI API Server with Open Models Locally, from LMSYS Org. Join us. Build local models. (View Tweet)

I am blown away by GorillaLLM! I had it running in an interaction with GPT4All, building on-demand software to solve any task. So far my local system has created 59 separate and connected software applications that it is using to solve a very complex problem. Now with…


In the next hour I will be installing the new FinGPT at a Sand Hill Road venture capital firm that has been a long-term retainer client. The vector database we have will incorporate the entire corpus of pitch decks into the local model. I will only work with one more VC on this.


BOOOM! GPTEngineer: specify what you want it to build, the AI asks for clarification, and then builds it. GPT Engineer is made to be easy to adapt and extend, and to let your agent learn how you want your code to look. An entire codebase from a single prompt. (View Tweet)

I took question 77 of The Save Wisdom Project Version 2.0 questions, recorded a response with a Sony voice memo recorder, used the open-source Whisper software to convert it to text, and made it into a text file for the GPT4All LocalDocs plugin. Now I have my wisdom preserved. Save Wisdom. (View Tweet)

“Decorated General says more advanced civilizations are keeping an eye on Planet Earth” (View Tweet)

orca_mini_3b! Yes, 3B and very useful! An OpenLLaMA-3B model trained on explain-tuned datasets, created using instructions and input from the WizardLM, Alpaca & Dolly-V2 datasets and applying the dataset-construction approaches of the Orca research paper. (View Tweet)

BOOOM! A StarCoder-based coding model from @WizardLM_AI: "WizardCoder-15B-v1.0 model achieves 57.3 pass@1 on the HumanEval Benchmarks .. 22.3 points higher than the SOTA open-source Code LLMs." Local, private coding LLMs. (View Tweet)
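For readers unfamiliar with the metric: pass@1 comes from the HumanEval benchmark, where the unbiased estimator from the HumanEval paper is pass@k = 1 - C(n-c, k)/C(n, k), averaged over problems, given n generated samples per problem of which c pass the unit tests. A minimal sketch:

```python
# Unbiased pass@k estimator from the HumanEval paper.
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Probability that at least one of k samples passes, when c of
    the n generated samples per problem are correct."""
    if n - c < k:
        return 1.0  # every size-k draw must contain a correct sample
    return 1.0 - comb(n - c, k) / comb(n, k)

# Example: 10 samples generated for a problem, 3 of them correct.
assert abs(pass_at_k(10, 3, 1) - 0.3) < 1e-12
```

A benchmark score like "57.3 pass@1" is this per-problem fraction averaged over all HumanEval problems, expressed as a percentage.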

The Linear Magnetic Motor Patent of 2013. (View Tweet)

“LLM Powered Autonomous Agents” Building agents with an LLM as the core controller is a cool concept. Several proof-of-concept demos, such as AutoGPT, GPT-Engineer and BabyAGI, serve as inspiring examples. (View Tweet)

“Astronomers reveal evidence of universe's 'background hum'” (View Tweet)

“Brian Roemmele —The Wisdom Keeper | Episode 168” Take 2 and a half hours and hang out with us. Learn why we must, absolutely must save wisdom. (View Tweet)

The first open model to beat ChatGPT-3.5 is here. Meet OpenChat_8192: 105.7% of ChatGPT (Vicuna GPT-4 benchmark). You will own your own local AI. Your data, your AI. (View Tweet)

From the hundreds of dumpsters I have dived into, one of the finds was over 2,700 social guidance films. I have used my AI to transcribe many of them, and hope to do all soon. This training data formed what I believe is the first true human-behavior model in AI. The… (View Tweet)

The Spiegel eye-roll test. While a person is doing this, the hypnotic operator watches the upgaze, the rolling of the eyes up toward the top of the head, and then evaluates how much of the sclera, the whites of the eyes, is showing as the person attempts to close their eyes.


“Scientists Claim That Quantum Theory Proves Consciousness Moves To Another Universe At Death” (View Tweet)

“An AI model has designed a functional computer in five hours” This is rather spectacular news. (View Tweet)

“China's military is leading the world in brain 'neurostrike' weapons: Report” Laughing at V2K technology, even as there were 40+ years of open patents on precisely how it works, now looks naïve. So how is one protected? Yes, tin foil hats. (View Tweet)

BOOM! In just about 48 hours we now have a fully uncensored 4-bit quantized 7B LLaMA 2. This was the fastest uncensoring of a model in history! Now free open source to use and free to build upon. FREE article soon! (View Tweet)