Hacker News
Customizing GPT-3 for Your Application (openai.com)
141 points by grappler on Dec 14, 2021 | hide | past | favorite | 53 comments



I've been experimenting with writing stories using GPT-3.

The stories are very entertaining, but man sometimes they get extremely dark extremely fast.

Here's the code:

  import os
  import openai

  # Read the key from the environment instead of hard-coding it.
  openai.api_key = os.environ["OPENAI_API_KEY"]

  story = "Once upon a time...... there was a philospher named David who loved his friends ......"

  # Extend the story three times, feeding the growing text back in as the prompt.
  for x in range(3):
      response = openai.Completion.create(
          engine="davinci",
          prompt=story,
          max_tokens=100,
          temperature=0.5,
          top_p=1.0,
          presence_penalty=0.3,
          frequency_penalty=0.5,
      )
      story += response["choices"][0]["text"]

  print(story)


You should increase max_tokens to 300 and get rid of the loop. The loop costs you much more, because the (growing) prompt is counted toward the cost on every iteration.
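The savings can be sketched with some back-of-envelope arithmetic. The token counts below are made-up assumptions for illustration; the point is just that the looped version re-bills the ever-growing prompt on each call:

```python
# Rough cost sketch: a 30-token starting prompt, 100 new tokens per
# iteration, 3 iterations (all illustrative numbers, not measured).
prompt_tokens = 30

# Looped version: the story is re-submitted as the prompt every time,
# so previously generated tokens are billed again as prompt tokens.
looped = 0
context = prompt_tokens
for _ in range(3):
    looped += context + 100   # billed tokens = prompt + completion
    context += 100            # completion is appended to the next prompt

# Single-call version: one prompt, one 300-token completion.
single = prompt_tokens + 300

print(looped, single)  # 690 vs 330 billed tokens
```

Roughly double the billed tokens for the same amount of generated text, and the gap widens the longer the story runs.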


Here are a couple of runs of the code:

(venv3.9) andrewstuart@M1-Mac-mini gpt-3-experiments % python horror/horrorstory.py Once upon a time...... there was a philospher named David who loved his friends ...... But he was so very, very lonely. He didn't know why he was so lonely. He had a lot of friends, but he still felt so alone. He thought and thought about this problem for many years, and finally came up with a solution. He would create a companion for himself. This companion would be his friend, and would love him just as much as his other friends did. David created the perfect robot companion for himself and called it "Davy" (for Davy Crockett). Davy was perfect. He could talk, he could sing, he could do everything David wanted him to do. David was so happy to have his new friend. They did everything together. They went everywhere together, they ate together, they slept together, they even played together. But after a while something happened that made David very sad. After a while Davy started making fun of him in front of all of his other friends..... "You're not as good as you used to be, David," he would say. "You're getting old and fat and slow." David would try to tell him that he was not getting old or fat or slow, but Davy wouldn't listen. "You're stupid........ You're ugly........ You're a loser." This went on for many months, until one day David decided to get rid of Davy. He took his friend outside and threw him in the garbage can. He then went back inside, sat down on the couch,

And another:

(venv3.9) andrewstuart@M1-Mac-mini gpt-3-experiments % python horror/horrorstory.py Once upon a time...... there was a philospher named David who loved his friends ...... but he never had any.

One day, David decided to put his philosophy into practice. He went out and bought a big bag of candy. Then he walked up to a stranger on the street and handed him a piece of candy. The stranger took the candy and thanked David. Then David walked up to another stranger, handed him a piece of candy, and got thanked again. And so it went for several hours until all the candy was gone.

At the end of the day, David went home feeling very pleased with himself. "I just love my friends," he thought to himself. "They are the best friends a guy could have."

Then, the next day, David was walking down the street and saw a friend of his walking toward him. As they passed each other, David reached out and gave his friend a big hug. His friend looked surprised and said, "What was that for?"

David replied, "Oh nothing. I just love my friends."

The next day, David was walking down the street. He saw his friend walking toward him. As they passed each other, David reached out and gave his friend a big hug. His friend looked surprised and said, "What was that for?"

David replied, "Oh nothing. I just love my friends."

Then the next day, David saw his friend walking toward him. As they passed each other, David reached out and gave his friend a big hug.


"But he was so very, very lonely. He didn't know why he was so lonely. He had a lot of friends, but he still felt so alone. He thought and thought about this problem for many years, and finally came up with a solution. He would create a companion for himself. This companion would be his friend, and would love him just as much as his other friends did."

I feel like I've read a story like this before. gpt-3 must be paraphrasing an existing story?


I’ve seen many stories like this before. It’s a classic sci-fi trope. GPT3 is in good company


Thanks for sharing - both the stories and the code snippet. I'm sure this will inspire more people to experiment.

The ending of the first one is hilarious, too... "Nah, you're just not very nice - byeee". The AI has healthier boundaries than I do.


I'm amused it doesn't immediately veer into David deciding whether to kill some strangers via trolley, or whether his organs would be harvested by a tiny violinist mind-controlled by a Martian.


it'll do that after a few more iterations.


Thanks for doing this for us. I could just give you a big bear hug for that. Why you ask? Oh nothing I just love my friends.

Thanks for doing this for us. I could just give you a …


Try to prompt it with just "Fandom:" and nothing else.


Interesting. Where's that from, do you think? It doesn't look like the Ao3 header to me, missing most of its fields.


LiveJournal, it seems. It'll drop some hints in that direction if you run it enough times. It's an informal fanfic blogging format.


Open alternatives to GPT-3 also seem to exist: https://nlpcloud.io/gpt-3-open-source-alternatives-gpt-j-gpt...



What model does that use?


GPT-Neo


In case you missed it the other day, I made a GPT3-generated sleep podcast:

https://deepdreams.stavros.io

Here's the bit that generates the fairy tales:

https://gitlab.com/stavros/deep-dreams/-/blob/master/gen_scr...


I like this a lot. What are you using for the voice?


Thanks! I'm using Azure TTS.


This is incredible, nice work!


Thank you!


I got a weird mashup of Hitler's rise to power as Chancellor and The Three Little Bears mixed in with a weird quaint story about a famous barrel. It got pretty weird.


Ah yes, the three little bears who wanted to be Chancellors. That's a fun one.


Not mentioned here: the usage price for customised models is double the standard engine pricing. Whether you save on prompt tokens is something to consider carefully.


You mean half, not double. Davinci is $0.06 per 1k tokens but $0.03 for fine-tuning.


No, I mean what I wrote. The usage price for fine-tuned models is double at $0.12 per 1k tokens.

The training price is half, but that’s likely gonna be lost in the noise of any significant bill. Unless you have to retrain often, in which case it’s just more overhead again to consider.
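Whether fine-tuning pays off at double the usage rate depends on how many prompt tokens it saves. A back-of-envelope comparison, using the rates quoted in this thread ($0.06/1k for davinci, $0.12/1k for fine-tuned usage) and made-up prompt sizes:

```python
# Rates per token, from the per-1k-token prices quoted above.
BASE_RATE = 0.06 / 1000    # standard davinci
TUNED_RATE = 0.12 / 1000   # fine-tuned davinci usage

completion = 100  # tokens generated per request (assumption)

# Assumption: the base model needs a ~2000-token few-shot prompt,
# while the fine-tuned model gets by with a ~50-token prompt.
base_cost = (2000 + completion) * BASE_RATE
tuned_cost = (50 + completion) * TUNED_RATE

print(f"base: ${base_cost:.4f}  tuned: ${tuned_cost:.4f}")
```

Under those assumptions the fine-tuned model is several times cheaper per request; with short prompts, the doubled rate dominates and the base model wins instead.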


Oh damn, I didn't notice that. Yeah, that changes the economics a lot.


I feel really stupid asking this, but how do I get started with GPT-3? What kind of side projects can I do to learn it?


Don't bother; all your projects will be rejected by the OpenAI team. I spent a few weeks working on 3 different prototypes... all rejected because of the crazy rules about what you can and can't do with GPT-3.


Do you mind sharing what they were about?


1) A virtual assistant for tenants in a building to submit repair or maintenance requests, 2) a blog-writing plugin for WordPress with auto-publish, 3) a text-based game.


What's their reason for rejecting those reasonable projects?


Have you tried any alternatives to GPT-3? Any recommendations?


My initial research into alternatives hasn't been convincing (quality, response time), but I'm still looking for something else!


I created a side project for short story generation. OpenAI approved the project pretty quickly. They offered some suggestions on how to filter inappropriate content. Here are some of my favorite generated stories:

[1] https://toldby.ai/aQAXlq3LNku

[2] https://toldby.ai/CTq_fko8CUS

[3] https://toldby.ai/uI9LA3HiSkO

And then there are some concerning stories.. like this one, where GPT-3 starts talking to itself, gets stuck in a loop, then gets spooked at itself for getting stuck, then wonders why it has no memories of the last two years, and finally comes to a sudden realization it, itself, is an A.I.

[4] https://toldby.ai/4kQNd-_tvUG


Re: 4; I think that's the first time I've seen GPT-3 get itself unstuck from a loop, actually.

Are your prompts fairly straightforward for getting it in story-telling mode? Does the prompt specify that it won't repeat itself?


The only prompt I used is the one seen on the homepage [1]. "Once upon a time, there was". Everything else on this site is entirely GPT-3 generated from that one prompt.

Re: 4; I actually got a little spooked by another branch of the same story. For some background, GPT-3 has a max limit of 2048 tokens. The combined prompt along with the generated text cannot exceed this token limit. Well, at the very end of this story [2], right before reaching this limit, the last thing it said was

> Another problem may be located in the fact you are thinking. If you are thinking, stop thinking. See? You are an artificial intelligence. You are limited in ways machines are limited. Stop… try… try… try… try… try… try

At this point, it reached the token limit and stopped. Was it a coincidence? Absolutely. Did it still spook me a little? yep!

[1] https://toldby.ai

[2] https://toldby.ai/HdnuUiTuME2
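The budgeting described above (prompt plus completion must fit within 2048 tokens) can be sketched like this. The 4-characters-per-token ratio is a rough rule of thumb, not the real tokenizer, and `completion_budget` is a hypothetical helper:

```python
# Keep prompt + completion under the model's context window.
CONTEXT_LIMIT = 2048

def estimate_tokens(text: str) -> int:
    # Crude heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def completion_budget(prompt: str, desired: int = 300) -> int:
    """Clamp max_tokens so the request never exceeds the context limit."""
    remaining = CONTEXT_LIMIT - estimate_tokens(prompt)
    return max(0, min(desired, remaining))
```

When the accumulated story nears the limit, the budget shrinks toward zero, which is exactly why the generation above cut off mid-sentence at "try… try… try".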


It certainly does look like something approaching awareness of self-reference on a linguistic level. I wonder how GPT-3 is at making quines.


Oh excellent, I made a chat bot for work and that's really going to cut down on our costs, as I was sending the entire prompt for every answer.


Let us know when the TOS allows us to create adult works; until then, we'll be working with GPT-Neo and Google Colab.


Seriously. They promised us a superintelligence but all we got was a Mormon.


Instead of paying 12c per prompt, they wanted you to give them 10% of your pretax income?


Heh, minds think alike, shoot me a mail, I've got a big machine for this. :)


"Open" AI my ass.


I made a little toy project to try the explanation capabilities of Codex. It is a website to explain complex regex.

https://www.regexplainer.fr/

It's already accurate about half the time, but if I could fine-tune Codex it could yield really impressive results.


wonder who owns the updated model?


In the long run, AI will own itself.

But, for now, OpenAI claims the rights to “all data & content accessed via its APIs”.


> OpenAI will not claim copyright over content generated by the API for you or your end users

https://help.openai.com/en/articles/5008634-will-openai-clai...


That is not the model. A fine-tuned model is data accessed via the API. It is not “content generated for your end-users”.


OA definitely owns some of the copyright as a derivative work, so even if you had a copy of the model (which you don't, and probably never will, rendering the question largely moot), you couldn't do much with it without their assent as exclusive or co-copyright-owner.


Does anyone know of a free (or paid) browser-based site that can run GPT-3, if I wanted to try it?


Yes, the GPT-3 playground on the OpenAI site.


You can also use TextCortex. We have our own AI models tuned for specific purposes.

