About Dreamore

Bring your dreams to life with Dreamore, the app that interprets and paints your subconscious thoughts. You can choose from different dream interpreters, such as Freud and Lord Shiva.

Official website: https://dreamore.app/

Community Posts
Quinn Leng
RT @GregKamradt: Claude 2.1 (200K Tokens) - Pressure Testing Long Context Recall

We all love increasing context lengths - but what's perfo…
Quinn Leng
RT @DrJimFan: Instead of taking OAI's merger offer, Anthropic launched major updates for Claude 2.1🎉. I think the below chart is the most i…
Quinn Leng
Anthropic really is OpenAI's strongest rival. Claude 2.1 raises the context window to 200K tokens and adds very practical tool use and system prompts, while answer accuracy has also improved significantly. Overall, the model takes a big step toward OpenAI's level.
-------------
From @Anthropic: Our new model Claude 2.1 offers an industry-leading 200K token context window, a 2x decrease in hallucination rates, system prompts, tool use, and updated pricing.

Claude 2.1 is available via API in our Console, and is powering our chat experience.
Quinn Leng
Another question I've been thinking about these past two days is how vulnerable humans are in the face of AI. The most promising company in the world was destroyed by three people in a matter of hours. If humans really do go extinct, it will most likely be our own fault, not AGI's.
-------------
From @orange.ai: 95% of OpenAI employees chose to follow Sam.
Such a high degree of alignment raises an essential question:
what exactly is OpenAI?
With these people and some money, another GPT-4 could be trained in no time.
The company name no longer seems all that important.
Quinn Leng
The departure of core members, and the joint demand by most employees that the board be reorganized, have had a huge impact. Given the decisive resources Microsoft holds here, it's hard to say whom it will favor going forward. I am genuinely worried about the impact on OpenAI.
-------------
From @ a new meal: @tinyfool @wshuyi @quinn_leng Yeah, it's OpenAI that makes you sigh. Who knows what the future holds. I have many plans tied to OpenAI and don't know whether I should continue them.
Quinn Leng
OpenAI was founded as a non-profit with academic roots, and since its birth it has carried a huge internal conflict: the academic pursuit of safety, stability, equality, and the benefit of all mankind, versus being the fastest-growing startup maneuvering among giants, one that needs to ship products as quickly as possible, capture maximum profit, and lock in epic resources for the next stage. The two clash violently all too easily. This ending really is a bit sad.
-------------
From @ a new meal: This outcome leaves me a little disappointed; I would rather Sam and Greg had gone out and started their own company. I'm also very pessimistic and regretful about OpenAI's future. I'm in Mexico right now. My wife got diarrhea and rushed to the hospital in the early hours of the morning, relying entirely on my translation and GPT to communicate. When I got back from the hospital and saw this news, I had mixed feelings. After chatting with @wshuyi, he felt the same. We were with @tinyfool and @quinn_leng just the other day…
Quinn Leng
RT @burkaygur: Live demo of Real-Time Image Generation powered by @fal_ai_data on @huggingface Spaces

Link below
Quinn Leng
Sam Altman has left OpenAI. For news like this to drop, something big must have happened on the company's board.
Quinn Leng
As soon as GPTs came out, a crowd rushed to build directory sites for them. As a GPT developer, I couldn't even tell which directory to publish my GPT on. It's time to build a directory of GPT directories.
-------------
From @Will: The directory of GPT directories has been updated, adding short reviews and 20+ directory sites.

Some quick analysis:
1. As the numbers grow, directories have to categorize and rank entries (by score/clicks/manual curation)
2. Lag is the most unbearable flaw. A page that loads 5000+ entries takes 20 seconds to a minute; the whole point of a directory is to be convenient and easy to use
Quinn Leng
OpenAI Dev Day tech talk, highly recommended viewing for LLM developers and enthusiasts: "How to maximize LLM performance"
- The officially recommended optimization path (optimizing the context vs. optimizing the model)
- When to use fine-tuning vs. RAG
- The advantages and disadvantages of RAG and fine-tuning
Quinn Leng
NVIDIA has announced the H200, its new top-end accelerator: double the H100's inference speed at half the cost of use, 141 GB of memory, and 40% more memory bandwidth. At this pace of development, GPT-4-turbo can be expected to drop in price again within half a year. t.co/OxSYzGDZyq
Quinn Leng
I just added a "buy the same item" shortcut to "Shop GPT", and it works really well. Add an Amazon affiliate link and I figure you could actually make money moving goods, hahahaha. t.co/LVTnJHYZrj
Quinn Leng
This GPT really works well. I tried making a gif of a panda eating watermelon, and the generated frames were very high quality; the only flaw is that sprite misalignment leaves the animation a bit jumbled. Looking at the author's prompts, they read like a complex program written in English, which is well worth learning from. Of course, at this level of customization the GPT builder can't help much; it mainly comes down to hand-written prompts.
-------------
From @宝玉: Gif-PT
Make a gif. Uses Dalle3 to make a spritesheet, then code interpreter to slice it and animate. Includes an automatic refinement and debug mode.


Use Dalle to draw images turning the user request into:
Item assets sprites. In-game sprites
A sprite…
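The pipeline described above (a spritesheet sliced into frames, then animated) can be sketched in Python with Pillow. This is a generic sketch under assumed parameters, not Gif-PT's actual code-interpreter script; the grid size (`cols`, `rows`) must match what DALLE actually drew:

```python
def frame_boxes(sheet_w, sheet_h, cols, rows):
    """Crop boxes (left, upper, right, lower) for a cols x rows spritesheet,
    in row-major order. Assumes the sheet divides evenly into the grid."""
    fw, fh = sheet_w // cols, sheet_h // rows
    return [(c * fw, r * fh, (c + 1) * fw, (r + 1) * fh)
            for r in range(rows) for c in range(cols)]

def spritesheet_to_gif(sheet_path, out_path, cols, rows, ms_per_frame=100):
    """Slice a spritesheet image into frames and save them as a looping gif."""
    from PIL import Image  # lazy import: the geometry helper above stays dependency-free
    sheet = Image.open(sheet_path)
    frames = [sheet.crop(box) for box in frame_boxes(*sheet.size, cols, rows)]
    # duration is per-frame display time in ms; loop=0 loops forever
    frames[0].save(out_path, save_all=True, append_images=frames[1:],
                   duration=ms_per_frame, loop=0)
```

The jumbled animation mentioned above is exactly what happens when the assumed grid does not match the generated sheet's actual layout or margins.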
Quinn Leng
Friends with ChatGPT Plus, try the "Rapper Talks Movies" GPT I just built: it recommends the latest, hottest, most unprotected Douban movies to you in rap form. The only pity is that my carefully designed rapper rhymes were completely flattened by ChatGPT's businesslike Taiwanese accent. t.co/TENOkFsSTx
Quinn Leng
The most honest GPT: I asked it to output an image, and it printed out the image's base64 encoding character by character, then wrecked itself.
Quinn Leng
RT @karpathy: LLM OS. Bear with me I'm still cooking.

Specs:
- LLM: OpenAI GPT-4 Turbo 256 core (batch size) processor @ 20Hz (tok/s)
- RA…
Quinn Leng
As an early Poe user, I watched them gradually add models and features, only for the product form they finally converged on to be copied and then surpassed by OpenAI's GPTs. Starting a company really is hard, hahahaha. That said, the soil for LLM startups is still a bit thin right now; it will loosen once the wind picks up.
Quinn Leng
RT @dotey: Starting a thread: please comment and share useful GPTs, using the following format:

- Name: XXX
- Description: XXX
- Link: XXX
- Reason for recommendation
Quinn Leng
GPT-4-turbo's 128K context starts to show forgetting beyond 64K. Granted, this test is very simple: insert a sentence at different positions in a novel and check whether the reply remembers it. I suspect forgetting would be more obvious on a more complex summarize-and-categorize task. Still, 64K is entirely sufficient for most scenarios. Long context really is a money-making machine: a single prompt can burn a dollar.
-------------
From @Greg Kamradt: Pressure Testing GPT-4-128K With Long Context Recall

128K tokens of context is awesome - but what's performance like?

I wanted to find out so I did a “needle in a haystack” analysis

Some expected (and unexpected) results

Here's what I found:

Findings:
* GPT-4's recall…
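The "needle in a haystack" procedure described above can be sketched generically: plant a sentence at a fractional depth of the context, then ask for it back. `ask_model` below is a hypothetical stand-in for whatever LLM call is being tested, not a specific API:

```python
def plant_needle(haystack: str, needle: str, depth: float) -> str:
    """Insert the needle sentence at roughly the given fractional depth,
    snapping forward to a sentence boundary so no word is split."""
    pos = int(len(haystack) * depth)
    cut = haystack.find(". ", pos)
    cut = cut + 2 if cut != -1 else pos
    return haystack[:cut] + needle + " " + haystack[cut:]

def recall_rate(ask_model, haystack, needle, answer, depths):
    """Fraction of insertion depths at which ask_model(context, question)
    returns a reply containing the planted answer."""
    question = "What is the special fact hidden in the text?"
    hits = sum(
        answer.lower() in ask_model(plant_needle(haystack, needle, d), question).lower()
        for d in depths
    )
    return hits / len(depths)
```

Sweeping depths (say, `[i / 10 for i in range(11)]`) across several context lengths yields a recall-vs-depth grid like the chart in the quoted thread.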
Quinn Leng
I tried it out, and the multilingual quality is very good. This is currently the most natural-sounding TTS for Chinese.
-------------
From @深圳皇女!!: How many people still haven't tried OpenAI's newly released TTS (text-to-speech)?

There's an online version here: enter text directly on the web page, with 5 male voices and 2 female voices to switch between. No installation or deployment needed, and best of all, it's free.

I tried some Chinese tongue twisters with my favorite nova voice.
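For those who would rather call the API than a web demo, a minimal sketch with OpenAI's Python SDK. The voice names (alloy, echo, fable, onyx, nova, shimmer) and the `tts-1` model are the documented ones; the client reads `OPENAI_API_KEY` from the environment:

```python
VOICES = {"alloy", "echo", "fable", "onyx", "nova", "shimmer"}

def speak(text: str, voice: str = "nova", out_path: str = "speech.mp3") -> str:
    """Synthesize `text` to an mp3 file using OpenAI's tts-1 model."""
    if voice not in VOICES:
        raise ValueError(f"unknown voice: {voice!r}")
    from openai import OpenAI  # lazy import; needs the `openai` package and an API key
    client = OpenAI()
    resp = client.audio.speech.create(model="tts-1", voice=voice, input=text)
    resp.write_to_file(out_path)
    return out_path
```

For example, `speak("扁担长，板凳宽", voice="nova")` would render a tongue twister with the nova voice mentioned above.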