We’ve built Polo, an open source REST client that runs in your browser. It’s powered by Elixir and Phoenix LiveView, and you can check it out at https://github.com/readyforproduction/polo. In this post, we will discuss some of its key features.

Our goal is to have an application that you can run anywhere, from your local machine to your private servers or in the cloud. It’s also agnostic to the operating system you’re using: all you need is a modern browser and Docker installed.

The main application is a single live view called ClientLive, which acts as a container for the application state and also coordinates what happens when a request is sent. It embeds two live components, one for requests and one for responses. Sending requests relies heavily on LiveView’s async operations, and responses in the live view are wrapped in the AsyncResult structure.

Both requests and responses have schemas that represent their structures. They use Ecto to cast and validate data, and to embed nested data within them. There are nested schemas in the project for headers, content body, and parameters.

Finch is the default HTTP client of the application, but this can easily be changed, since new clients are just implementations of the HTTP behaviour. All you need to do is implement a single callback function according to the specification, and that’s it.
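As a hedged illustration (the callback name and return shape below are assumptions based on this post, not Polo’s exact specification), the contract and a custom adapter could look roughly like this:

defmodule Polo.Client.HTTP do
  # Sketch of the behaviour: a single callback that receives a request
  # struct and returns an ok/error tuple. Names and types are assumptions.
  @callback send_request(Polo.Client.Request.t()) ::
              {:ok, Polo.Client.Response.t()} | {:error, term()}
end

defmodule Polo.Client.HTTP.MyClient do
  # Hypothetical adapter: implement the single callback with the HTTP
  # library of your choice and translate its result into a response struct.
  @behaviour Polo.Client.HTTP

  @impl true
  def send_request(%Polo.Client.Request{} = request) do
    # The actual HTTP call and result translation are omitted in this sketch.
    {:error, {:not_implemented, request.url}}
  end
end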

Finally, Polo uses NimblePublisher to manage request collections. You describe the attributes using JSON and Markdown, and the application does the rest. This allows you to easily create collections and share them with your peers. There’s also no limit to the number of collections you can run in the application.

Polo in Action

Although we’ve been using patterns found in common Phoenix applications, there are a few topics I’d like to cover in detail. Let’s start with dynamic forms.

Dynamic forms with Ecto and LiveView

One of the requirements is to build a dynamic UI to support multiple headers and parameters for a given request. The first step to accomplish this lives in the request schema.

defmodule Polo.Client.Request do
  use Ecto.Schema
  import Ecto.Changeset

  # ...type declarations and docs

  @valid_methods [:get, :post, :put, :patch, :delete, :options, :head]

  @primary_key {:id, :binary, autogenerate: false}
  embedded_schema do
    field :method, Ecto.Enum, values: @valid_methods
    field :url, :string
    field :sent_at, :naive_datetime
    field :documentation, :binary
    field :collection_name, :string
    field :title, :string

    embeds_many :headers, Polo.Client.Header, on_replace: :delete
    embeds_many :parameters, Polo.Client.Parameter, on_replace: :delete
    embeds_one :body, Polo.Client.Body, on_replace: :delete
  end
end

Note that we use embeds_many/3 to associate a request with headers and parameters. There is also on_replace: :delete, which tells Ecto to remove embedded data that is not present in an update. This option is needed to build the dynamic form.
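The nested schemas themselves aren’t shown in this post; based on the template further down, a parameter embed would look roughly like this (a sketch only, the real Polo.Client.Parameter module may differ):

defmodule Polo.Client.Parameter do
  use Ecto.Schema
  import Ecto.Changeset

  # Sketch: the dynamic form below relies on :name and :value fields.
  @primary_key false
  embedded_schema do
    field :name, :string
    field :value, :string
  end

  def changeset(parameter, attrs \\ %{}) do
    cast(parameter, attrs, [:name, :value])
  end
end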

Next is the changeset, which casts and validates the attributes, and casts the embeds.

def changeset(request, attrs \\ %{}) do
    request
    |> cast(attrs, [:method, :url, :documentation, :id, :collection_name, :title])
    |> validate_required([:method, :url])
    |> validate_inclusion(:method, @valid_methods, message: "must be a valid HTTP method")
    |> validate_length(:title, max: 20)
    |> cast_headers()
    |> cast_parameters()
    |> cast_body()
    # |> more_stuff()
end

defp cast_headers(changeset) do
    cast_embed(changeset, :headers, sort_param: :headers_sort, drop_param: :headers_drop)
end

defp cast_parameters(changeset) do
    cast_embed(changeset, :parameters, sort_param: :parameters_sort, drop_param: :parameters_drop)
end

Pay attention to the sort_param and drop_param options when casting the embeds. According to the Ecto documentation, these two options are used for sorting and deleting data from collections, which is exactly what we need to solve our problem. Also note the names we use for each option; they are important for the user interface.

Ecto interprets the values in sort_param as a list of items in an embedded collection; a value that doesn’t match an existing index means a new item should be added. drop_param is the opposite: a value present there means the corresponding item is removed from the collection.
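To make this concrete, here is a hedged sketch of the kind of params a form submit could produce for this schema, and how the changeset above would interpret them (the values are illustrative, not taken from Polo):

# Two existing parameters (indexes "0" and "1"), one brand-new entry (the
# unknown "new" value in parameters_sort), and index "1" marked for removal.
params = %{
  "method" => "get",
  "url" => "https://api.example.com",
  "parameters" => %{
    "0" => %{"name" => "page", "value" => "1"},
    "1" => %{"name" => "per_page", "value" => "50"}
  },
  "parameters_sort" => ["0", "1", "new"],
  "parameters_drop" => ["1"]
}

changeset = Polo.Client.Request.changeset(%Polo.Client.Request{}, params)
# After casting, the :parameters embed keeps item "0", drops item "1"
# (drop wins over sort), and appends one empty parameter for the unknown
# "new" value in parameters_sort.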

Now that we have the schema ready, we can go to the live view.

<.inputs_for :let={nested_field} field={@form[:parameters]}>
  <div class="c-embedded-inputs">
    <input type="hidden" name="request[parameters_sort][]" value={nested_field.index} />

    <input
      type="text"
      name={nested_field[:name].name}
      value={Phoenix.HTML.Form.normalize_value("text", nested_field[:name].value)}
      class="c-input"
      placeholder="Name"
      aria-label="Parameter Name"
      phx-debounce="blur"
    />

    <input
      type="text"
      name={nested_field[:value].name}
      value={Phoenix.HTML.Form.normalize_value("text", nested_field[:value].value)}
      class="c-input"
      placeholder="Value"
      aria-label="Parameter Value"
      phx-debounce="blur"
    />

    <label class="c-embedded-inputs__remove">
      <.icon name="hero-trash" />
      <input
        disabled={nested_field.index === 0}
        type="checkbox"
        name="request[parameters_drop][]"
        value={nested_field.index}
        aria-label="Remove"
      />
    </label>
  </div>

  <input type="hidden" name="request[parameters_drop][]" />
</.inputs_for>

<label class="c-embedded-inputs__add">
  <input type="checkbox" name="request[parameters_sort][]" />
  <.icon name="hero-plus-circle" />
  <span>Add new value</span>
</label>

The inputs_for/1 component is the way to go for creating nested forms in the UI. Let’s focus on the elements that use the sort_param and drop_param options.

<input type="hidden" name="request[parameters_sort][]" value={nested_field.index} />

The hidden input ensures that every element in the embedded collection is present in the request[parameters_sort][] list. Remember that parameters_sort is what we called sort_param in the request schema. In other words, every element listed in parameters_sort is kept.

Now, to remove an item from the collection, we do the opposite:

<label class="c-embedded-inputs__remove">
  <.icon name="hero-trash" />
  <input
    type="checkbox"
    name="request[parameters_drop][]"
    value={nested_field.index}
    aria-label="Remove"
  />
</label>

When the input is checked, the item is added to the request[parameters_drop][] list. parameters_drop is what we called drop_param, so we are basically saying that we want to drop all items that exist in the parameters_drop list.

Adding a new item follows the same idea.

<label class="c-embedded-inputs__add">
  <input type="checkbox" name="request[parameters_sort][]" />
  <.icon name="hero-plus-circle" />
  <span>Add new value</span>
</label>

When the input is checked, a value that doesn’t correspond to any existing index is appended to the request[parameters_sort][] list, and Ecto interprets unknown values in a sort_param as new items to add. Then, inputs_for/1 takes care of creating an index for the new element.

We also need to keep a hidden input with the drop_param without any values set. This means that we’re explicitly telling Ecto not to remove any items if no drop_param inputs are checked.

<input type="hidden" name="request[parameters_drop][]" />

All of these changes affect the behavior of the inputs_for/1 component: items in parameters_sort are displayed in the UI, while items in parameters_drop are removed from it.

The last step is to set up a change event on the form that wraps the inputs_for/1 components. This is done by adding the phx-change="change" binding to the form.
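For context, here is a hedged sketch of the form element that could wrap these inputs; the CSS class, the phx-target, and the extra tab field are assumptions based on the handler below, not Polo’s actual markup.

<.form
  for={@form}
  phx-change="change"
  phx-submit="submit"
  phx-target={@myself}
  class="c-request-form"
>
  <input type="hidden" name="tab" value={@tab} />
  <!-- method, URL, and the inputs_for/1 blocks shown above go here -->
</.form>

The event handler for the change event is as follows: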

def handle_event(
    "change",
    %{"request" => request_params, "tab" => tab},
    %{assigns: %{request: request}} = socket
  ) do
    # merge current data with new data
    request_params = merge_form_params(socket, request_params)

    changeset =
      request
      |> Client.change_request(request_params)
      # |> more stuff

    socket =
      socket
      |> assign_form(changeset)
      # |> more stuff

    {:noreply, socket}
end

A new changeset is created with the latest data available in the form, and then our UI is updated accordingly. After all these steps, we finally have a dynamic form. Let’s move on to the next topic, the async assigns.

Async assigns

In Polo, sending requests is an async operation. This means we don’t block the UI while waiting for a response to our request. We also need a way to handle all the different states we can be in, such as loading, error, and of course the one where the request is finished and the response is available. We decided to use the start_async/3 function and the Phoenix.LiveView.AsyncResult module to solve this problem.

There is a submit event handler in the Request live component. The job of this callback is to prepare the data for submission, or to surface an error if something is invalid. If everything works correctly, we notify the parent live view that the request is ready to be sent to its destination.

defmodule PoloWeb.ClientLive.Request do
    def handle_event(
        "submit",
        %{"request" => request_params},
        %{assigns: %{request: request}} = socket
      ) do
        request_params = merge_form_params(socket, request_params)
        changeset = Client.change_request(request, request_params)
        socket = assign_form(socket, changeset)

        case Client.create_request(request, request_params) do
          {:ok, request} ->
            send(self(), {:request_submitted, request})

            {:noreply, socket}

          {:error, error_changeset} ->
            {:noreply, assign_form(socket, error_changeset)}
        end
    end
end

Now, in the ClientLive live view, there’s a callback to handle the event sent by the live component. After a few assignments to the socket, we start our async operation by calling start_async/3. This function wraps our callback (Client.send_request/1) in an asynchronous task and defines the name used to handle the result of the operation. The async operation runs in a new Elixir process, so if something bad happens there, our live view will continue to work.

defmodule PoloWeb.ClientLive do
    def handle_info({:request_submitted, request}, socket) do
        socket =
          socket
          |> assign_request(request)
          |> assign_response(AsyncResult.loading())
          |> start_async(:request, fn ->
            Client.send_request(request)
          end)

        {:noreply, socket}
    end
end

In Polo, there are three callbacks configured for the async operation called :request. From their signatures, we can understand their purpose. Also note that we use handle_async/3 instead of handle_info/3, which makes it very clear that these callbacks are for async operations.

def handle_async(:request, {:ok, {:ok, response}}, socket) do
    socket = assign_response(socket, AsyncResult.ok(response))

    {:noreply, socket}
end

def handle_async(:request, {:ok, {:error, _changeset}}, socket) do
    socket =
      socket
      |> clear_response()
      |> put_flash(:error, "Your request has failed.")

    {:noreply, socket}
end

def handle_async(:request, {:exit, _}, socket) do
    socket =
      socket
      |> clear_response()
      |> put_flash(:error, "Your request has failed.")

    {:noreply, socket}
end

Combined with the async operation, we use the AsyncResult module to wrap the response data. This allows us to track the state of the data according to the state of the operation. Look at how it’s used when the request starts and when it finishes.

defmodule PoloWeb.ClientLive do
    def handle_info({:request_submitted, request}, socket) do
        socket =
          socket
          # |> doesn't matter here
          |> assign_response(AsyncResult.loading())
          |> start_async(:request, fn ->
            Client.send_request(request)
          end)

        {:noreply, socket}
    end

    def handle_async(:request, {:ok, {:ok, response}}, socket) do
        socket = assign_response(socket, AsyncResult.ok(response))

        {:noreply, socket}
    end
end

Viewing the result of an async operation is straightforward. The following is a snippet of code from the response template.

<div :if={@response.loading} class="c-response__loading">
    <p>
      <.spinner class="c-spinner" />
      <span>Loading...</span>
    </p>

    <button class="c-button c-button--tiny" type="button" phx-click="cancel">
      Cancel Request
    </button>
</div>

<div
    :if={@response.result}
    id="response-editor"
    class="c-response__editor"
    phx-hook="ResponseEditor"
    data-content={@response.result.body}
>
    <div
      id="response-editor-container"
      class="c-response__editor-container"
      data-component="response-preview"
    >
    </div>
</div>

To make things even easier, we can use the async_result/1 component, which has predefined slots for errors, loading state, and the actual result.
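A hedged sketch of what that could look like for the response assign; the markup inside the slots is illustrative, not Polo’s actual template.

<.async_result :let={response} assign={@response}>
  <:loading>
    <.spinner class="c-spinner" />
    <span>Loading...</span>
  </:loading>
  <:failed :let={_failure}>Your request has failed.</:failed>
  <pre><%= response.body %></pre>
</.async_result>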

Managing collections

The most useful feature of a REST client is the ability to save requests in collections and share them with your team. In Polo, you can do this using JSON and Markdown.

The project uses NimblePublisher, a minimal filesystem-based publishing engine with Markdown support and code highlighting, to manage collections. I encourage you to read the docs to understand the basics; here I’ll focus on how it’s used in Polo.

{
    "method": "get",
    "url": "https://api.github.com/repos/readyforproduction/polo"
}
---
# Endpoint Documentation:

GET https://api.github.com/repos/readyforproduction/polo

## Description:

This endpoint retrieves information about a specific repository on GitHub. It
requires the owner's username and the repository's name as parameters. The
response includes various details about the repository, such as its name,
description, owner, number of stars, forks count, and more.

## Response:

* name (string): The name of the repository.
* description (string): A brief description of the repository.
* owner (object): Information about the owner of the repository, including username and profile URL.
* stars_count (integer): The number of stars/favorites the repository has received.
* forks_count (integer): The number of times the repository has been forked.
* url (string): The URL of the repository on GitHub.
* created_at (string): The date and time when the repository was created.
* updated_at (string): The date and time when the repository was last updated.
* language (string): The primary programming language used in the repository.

The first part of the file is plain JSON with the attributes from the request schema, and the second part is where you can add documentation for the endpoint, which can be accessed by clicking on the Documentation tab in the UI.

The difference between how NimblePublisher is typically used and how it works in Polo is in the way we define the attributes for the module responsible for building the struct. Typically, this is done with Elixir maps; in Polo we use plain JSON instead, since it’s the more common format. For this we need a custom parser:

defmodule Polo.Client.Parser do
  @doc false
  def parse(_path, contents) do
    [attrs, body] = :binary.split(contents, ["\n---\n"])

    # Need to keep the same converter on body.
    # Ref.: https://github.com/dashbitco/nimble_publisher/blob/master/lib/nimble_publisher.ex#L127
    body = Earmark.as_html!(body, %Earmark.Options{})

    attrs =
      attrs
      |> Jason.decode!()
      |> merge_params(%{"documentation" => body})

    {attrs, body}
  end

  defp merge_params(attrs, overrides) do
    %{}
    |> Map.merge(attrs)
    |> Map.merge(%{"id" => generate_request_id(attrs)})
    |> Map.merge(overrides)
  end

  defp generate_request_id(attrs) do
    :crypto.hash(:md5, :erlang.term_to_binary(attrs))
    |> Base.encode64()
    |> String.replace("/", "-")
  end
end

It’s not rocket science: we set a custom hash as the request id, parse the body of the file with Earmark, and merge it into the attributes. Then, in the build phase, all the attributes are available to create a request schema (see the sketch after the configuration below). This is what the NimblePublisher configuration for Polo looks like:

defmodule Polo.Client do
  alias Polo.Client.{HTTP, Parser, Request, Response}

  use NimblePublisher,
    build: Request,
    from: Application.app_dir(:polo, "priv/collections/**/*.md"),
    as: :requests,
    parser: Parser

  # more stuff
end
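The build: Request option means NimblePublisher calls a build/3 function on the Request module with the file path, the parsed attributes, and the body. That function isn’t shown in the post; a hedged sketch of what it could look like, reusing the changeset from earlier:

# Hypothetical sketch, not Polo’s actual implementation: turn the parsed
# attributes into a Polo.Client.Request struct using the existing changeset.
def build(_path, attrs, _body) do
  %__MODULE__{}
  |> changeset(attrs)
  |> Ecto.Changeset.apply_changes()
end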

Now, in the ClientLive live view, we use handle_params/3 to load the request if the request id parameter is available. We use Polo.Client.get_request/1 to find the request in the NimblePublisher collection; if it isn’t found, we create an empty request and display a flash message.

defmodule PoloWeb.ClientLive do
  @moduledoc """
  LiveView to handle interactions when sending requests.
  """
  alias Phoenix.LiveView.AsyncResult
  use PoloWeb, :live_view

  alias Polo.Client

  def handle_params(
        %{"request_id" => request_id},
        _uri,
        socket
      ) do
    case Client.get_request(request_id) do
      nil ->
        {:noreply,
         socket
         |> put_flash(:error, "Request not found.")
         # |> other assignments
         |> assign_request()
         # |> other assignments
        }

      request ->
        {:noreply,
         socket
         # |> other assignments
         |> assign_request(request)
         # |> other assignments
         |> assign_request_selected(request_id)
        }
    end
  end
end
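Client.get_request/1 itself isn’t shown here; since the as: :requests option exposes the built structs as the @requests module attribute, a minimal version inside Polo.Client could look like this (a sketch, not necessarily Polo’s exact code):

# Inside Polo.Client: look up a request by id in the compiled collection.
def get_request(request_id) do
  Enum.find(@requests, fn request -> request.id == request_id end)
end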

The solution we chose to manage collections is not the most robust, but it meets our goals. It provides an easy way to store requests and share them with your peers: you could simply create a GitHub repository with request files and clone it into priv/collections.

There are other interesting things in Polo, like the use of behaviours to select which HTTP client to use for requests, CodeMirror integration via hooks, and some new CSS features that make life easier, but I will stop here and let you explore them for yourself at https://github.com/readyforproduction/polo. We still have a few features to develop before the first release, but it’s in pretty good shape to be used right now.

See you in the next post!