  • It’s actually not. Abstracts are targeted at academics and researchers, and oftentimes preserve the complexity. Take, for example, the abstract of the paper this video’s about:

    Reported here are experiments that show that ribonucleoside triphosphates are converted to polyribonucleic acid when incubated with rock glasses similar to those likely present 4.3–4.4 billion years ago on the Hadean Earth surface, where they were formed by impacts and volcanism. This polyribonucleic acid averages 100–300 nucleotides in length, with a substantial fraction of 3′,5′-dinucleotide linkages. Chemical analyses, including classical methods that were used to prove the structure of natural RNA, establish a polyribonucleic acid structure for these products. The polyribonucleic acid accumulated and was stable for months, with a synthesis rate of 2 × 10⁻³ pmoles of triphosphate polymerized each hour per gram of glass (25°C, pH 7.5). These results suggest that polyribonucleotides were available to Hadean environments if triphosphates were. As many proposals are emerging describing how triphosphates might have been made on the Hadean Earth, the process observed here offers an important missing step in models for the prebiotic synthesis of RNA.

    While it is less complex than the paper, it is nevertheless dense and laden with jargon. Your average person with a high school education will either not understand it well or be absolutely turned off by its density. They’re also just very unlikely to stumble across it.

    I could have the machine reword just the abstract, but an abstract is not comprehensive, which reduces the quality of the result. By having the entire paper in its context window, the LLM is less likely to hallucinate, and the added information helps it produce better summaries drawing on all of the paper’s sections, notably the limitations section. A sketch of that workflow follows.
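
    As a rough illustration, a minimal sketch of that workflow using the openai Python package; the model name and paper.txt (a hypothetical plain-text dump of the full paper) are assumptions for the example, not anything from the comment:

        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        # Hypothetical plain-text dump of the entire paper, not just the abstract,
        # so the whole thing lands in the model's context window.
        with open("paper.txt") as f:
            paper_text = f.read()

        response = client.chat.completions.create(
            model="gpt-4",  # assumed model name, for illustration only
            messages=[
                {
                    "role": "system",
                    "content": "Summarize this paper for a reader with a high school "
                               "education. Cover every section, especially the limitations.",
                },
                {"role": "user", "content": paper_text},
            ],
        )
        print(response.choices[0].message.content)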

  • If you’re using ChatGPT, you’re using an outdated model. GPT-4 is better and has been out for some time.

    ChatGPT’s model is weaker and not something I’d use frequently; GPT-4 is much stronger and much more useful. And the next generation is coming soon, which should be better still.

    It’s not like fusion in its current state, since LLMs are already ready for use; the only remaining task is making them more effective. To extend the fusion example, it would be as if we’d finally developed a reactor that generates more energy than it consumes, and now only sought to make it produce more power.

  • Hey, I recognize you from this comment! You flipped that switch so many decades ago, ruining everything I had worked so hard for. I’ll always remember.

    Those lost 50KB of work will forever be etched into my mind. Quite literally: the second I get my hands on a 30TB neurolink, you bet your goddamn ass I’m making a 50KB text file with your name on repeat, so that I’ll always hear your name echo in my thoughts. “u/Kalkaline@programming.dev flipped my surge protector’s switch”, for x in range infinity.
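
    In actual Python, that grudge file would look something like this (grudge.txt and the 50KB cutoff are just the story’s numbers):

        # Write the offender's name on repeat until the file reaches 50KB,
        # the size of the work that was lost.
        line = "u/Kalkaline@programming.dev flipped my surge protector's switch\n"
        with open("grudge.txt", "w") as f:
            while f.tell() < 50 * 1024:
                f.write(line)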