Create a GPT Wrapper

I, for one, am excited to finally lose my job and pivot to my real passion of raising chickens, growing tomatoes, and building my bunker. However, there might still be a chance for you guys to make some big bucks before the mass layoffs.

The new GPT version comes packed with various API enhancements, so it's now easier than ever to become one of the thousands of entrepreneurs building a micro-SaaS product that is a plain wrapper over ChatGPT.

Trust me, this is easier than you might think, and to prove it, in under 3 minutes we’ll build a web app that gives you recipes and nutritional values based on the leftovers you have in your fridge.

Let’s make sure we are running on an up-to-date Node locally, and then initialize a Vite project.

$ nvm use v20
$ npm create vite@latest

We’ll use SolidJS to build the UI since it is one of the simplest yet most powerful frameworks around these days. Styling-wise, I’m a fan of working with CSS directly, but we are in a hurry, so we’ll use Tailwind, which is famous for its rapid prototyping capabilities.
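If you go with the same stack, pick the Solid + TypeScript template when Vite prompts you, and wire up Tailwind with its standard CLI setup (assuming Tailwind v3 here; the exact steps may differ on newer versions):

$ npm install -D tailwindcss postcss autoprefixer
$ npx tailwindcss init -p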

In the index file, let’s render our App component (which we’ll define next) into the DOM.

import { render } from "solid-js/web";

// App is the component we'll define below.
const root = document.getElementById('root');
render(() => <App />, root!);

If you are not completely familiar with Solid, don’t worry! This is one of the simplest frameworks out there.

Apps are built with components that have an internal reactive state managed through signals. Whenever a signal is updated, its associated effects, including JSX template updates, are executed efficiently.
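For instance, here is a standalone sketch of that reactivity (not part of our app, purely illustrative):

import { createSignal, createEffect } from "solid-js";

const [count, setCount] = createSignal(0);

// The effect runs once immediately, then again whenever `count` changes.
createEffect(() => console.log("count is", count()));

setCount(1); // logs "count is 1"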

In our case, we’ll define two signals, one for the user input, and one for the GPT response.

import { createSignal } from "solid-js";

function App() {
  const [ingredients, setIngredients] = createSignal("");
  const [recipe, setRecipe] = createSignal({
    preparationMethod: "",
    nutritionalInformations: "",
  });
}

Getting responses back from the OpenAI API might take a couple of seconds, so let’s keep track of the loading state as well.

function App() {
  const [ingredients, setIngredients] = createSignal("");
  const [recipe, setRecipe] = createSignal({
    preparationMethod: "",
    nutritionalInformations: "",
  });
  const [loading, setLoading] = createSignal(false);
}

Then, in JSX, we’ll add a textarea to capture the user input, a button to submit the form, and a container where the GPT response will be rendered.

function App() {
  const [ingredients, setIngredients] = createSignal("");

  const [recipe, setRecipe] = createSignal({
    preparationMethod: "",
    nutritionalInformations: "",
  });

  const [loading, setLoading] = createSignal(false);

  return (
    <div class="bg-white shadow-md rounded-lg p-8 m-auto max-w-lg">
      <textarea
        value={ingredients()}
        onChange={(ev) => setIngredients(ev.currentTarget.value)}
      ></textarea>
      <button onClick={getRecipe} disabled={loading()}>
        Get Recipe
      </button>
      {!loading() && recipe().preparationMethod && (
        <>
          <p class="bg-gray-100">{recipe().preparationMethod}</p>
          <p class="bg-gray-100">{recipe().nutritionalInformations}</p>
        </>
      )}
    </div>
  );
}
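One small Solid-specific note: unlike React, Solid keeps native event semantics, so onChange on a textarea only fires once the value is committed (on blur). If you’d rather update the signal on every keystroke, you could swap it for onInput:

      <textarea
        value={ingredients()}
        onInput={(ev) => setIngredients(ev.currentTarget.value)}
      ></textarea>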

When the button is clicked, we’ll make a call to the GPT API.

async function getRecipe() {
  setLoading(true);

  const response = await fetchGPTResponse(ingredients());
  setRecipe(response);
  setLoading(false);
}

<button onClick={getRecipe} disabled={loading()}>
  Get Recipe
</button>;
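A word of caution: if the request throws, loading would stay stuck on true and the button would remain disabled forever. A slightly more defensive sketch of the same handler, just wrapped in try/finally:

async function getRecipe() {
  setLoading(true);
  try {
    const response = await fetchGPTResponse(ingredients());
    setRecipe(response);
  } finally {
    // Reset the loading flag even if the API call fails.
    setLoading(false);
  }
}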

So, in a new service file, I’ll export a fetchGPTResponse function that accepts the prompt as an argument.

We’ll send our request through the Fetch API, attach the authorization header, and define the payload in the body.

// The OpenAI chat completions endpoint. The key is assumed to come from a
// Vite env variable (e.g. VITE_OPENAI_API_KEY); fine for a quick demo, but
// remember that anything bundled for the browser exposes the key.
const API_URL = "https://api.openai.com/v1/chat/completions";
const API_KEY = import.meta.env.VITE_OPENAI_API_KEY;

export async function fetchGPTResponse(prompt: string) {
  const response = await fetch(API_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o",
      messages: [
        {
          role: "system",
          content: "You are a helpful assistant designed to output JSON."
        },
        {
          role: "user",
          content: `Generate a food recipe based on these ingredients: ${prompt}.
            The response must be JSON in the format:
            {
              preparationMethod: string,
              nutritionalInformations: string
            }`
        }
      ],
      response_format: {
        type: "json_object"
      }
    }),
  });

  return JSON.parse((await response.json()).choices[0].message.content);
}

After specifying the model to be used, I’m defining a list of messages. The first one is a system-level instruction to set up the assistant’s behavior or context. The second message is marked as coming from the user and contains the actual prompt to be interpreted by the model.

Finally, I am specifying the desired response format, which means I’ll be able to reliably parse the GPT response, and send it back to my Solid component.
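Since we’re dictating the JSON shape ourselves, we could also type the parsed result on the client so the component knows exactly what it renders. A small sketch, mirroring the schema from the prompt:

interface Recipe {
  preparationMethod: string;
  nutritionalInformations: string;
}

// The service signature then becomes:
// export async function fetchGPTResponse(prompt: string): Promise<Recipe>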

Back in the terminal, we can run our project, and see the results in the browser.

$ npm run dev

All you have to do now is add some real functionality, market your product, find some users, monetize them, make the business profitable (because, trust me, those OpenAI bills can easily get out of hand), and sell it all for an undisclosed, outrageous amount of money.

If you feel like you learned something, you should watch some of my YouTube videos or subscribe to the newsletter.

Until next time, thank you for reading!