tapir, or Typed API descRiptions¶
With tapir you can describe HTTP API endpoints as immutable Scala values. Each endpoint can contain a number of input parameters, error-output parameters, and normal-output parameters. An endpoint specification can be interpreted as:
- a server, given the “business logic”: a function, which computes output parameters based on input parameters. Currently supported: akka-http, http4s.
- a client, which is a function from input parameters to output parameters. Currently supported: sttp.
- documentation. Currently supported: OpenAPI.
Tapir is licensed under Apache2, the source code is available on GitHub.
Quickstart¶
To use tapir, add the following dependency to your project:
"com.softwaremill.tapir" %% "tapir-core" % "0.7.10"
This will import only the core classes needed to create endpoint descriptions. To generate a server or a client, you will need to add further dependencies.
Most of tapir's functionality is grouped into package objects which provide builder and extension methods, so it's easiest to work with tapir if you import whole packages, e.g.:
import tapir._
If you don’t have it already, you’ll also need partial unification enabled in the compiler (alternatively, you’ll need to manually provide type arguments in some cases). In sbt, this is:
scalacOptions += "-Ypartial-unification"
Finally, type:
endpoint.
and see where auto-complete gets you!
Example usages¶
To see an example project using Tapir, check out this Todo-Backend using tapir and http4s.
Also check out the simple runnable example which is available in the repository.
Goals of the project¶
- programmer-friendly, human-comprehensible types, that you are not afraid to write down
- (also inferable by IntelliJ)
- discoverable API through standard auto-complete
- separate “business logic” from endpoint definition & documentation
- as simple as possible to generate a server, client & docs
- based purely on case class-based, immutable and reusable data structures
- first-class OpenAPI support. Provide as much or as little detail as needed.
- reasonably type safe: use only as many types as needed to safely generate the server/client/docs
Anatomy of an endpoint¶
An endpoint is represented as a value of type Endpoint[I, E, O, S], where:
- I is the type of the input parameters
- E is the type of the error-output parameters
- O is the type of the output parameters
- S is the type of streams that are used by the endpoint’s inputs/outputs
Input/output parameters (I, E and O) can be:
- of type Unit, when there’s no input/output of the given type
- a single type
- a tuple of types
Hence, an empty, initial endpoint (tapir.endpoint), with no inputs and no outputs, from which all other endpoints are derived, has the type:
val endpoint: Endpoint[Unit, Unit, Unit, Nothing] = ...
An endpoint which accepts two parameters of types UUID and Int, upon error returns a String, and on normal completion returns a User, would have the type:
Endpoint[(UUID, Int), String, User, Nothing]
You can think of an endpoint as a function, which takes input parameters of type I and returns a result of type Either[E, O], where inputs or outputs can contain streaming bodies of type S.
Defining an endpoint¶
The description of an endpoint is an immutable case class, which includes a number of methods:
- the name, description, etc. methods allow modifying the endpoint information, which will then be included in the endpoint documentation
- the get, post etc. methods specify the HTTP method which the endpoint should support
- the in, errorOut and out methods allow adding a new input/output parameter
- the mapIn, mapInTo, … methods allow mapping the current input/output parameters to another value or to a case class
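For instance, here is a minimal sketch (the endpoint, paths and parameter names are illustrative) which combines several of the methods listed above:
import tapir._

// an illustrative endpoint: a GET with constant path segments, two query parameters,
// a plain-text error output and a plain-text body output
val booksListing: Endpoint[(String, Option[Int]), String, String, Nothing] =
  endpoint.get
    .in("books" / "list")
    .in(query[String]("genre"))
    .in(query[Option[Int]]("limit"))
    .errorOut(stringBody)
    .out(stringBody)
    .description("Lists books of the given genre")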
An important note on mapping: in tapir, all mappings are bi-directional. That’s because each mapping can be used to generate a server or a client, as well as in many cases can be used both for input and for output.
Next¶
Read on about describing endpoint inputs/outputs.
Defining endpoint’s input/output¶
An input is described by an instance of the EndpointInput trait, and an output by an instance of the EndpointOutput trait. Some inputs can be used both as inputs and outputs; then, they additionally implement the EndpointIO trait.
Each input or output can yield/accept a value (but doesn’t have to).
For example, query[Int]("age"): EndpointInput[Int] describes an input, which is the age parameter from the URI’s query, and which should be coded (using the string-to-integer codec) as an Int.
The tapir package contains a number of convenience methods to define an input or an output for an endpoint.
For inputs, these are:
- path[T], which captures a path segment as an input parameter of type T
- any string, which will be implicitly converted to a constant path segment. Path segments can be combined with the / method, and don’t map to any values (have type EndpointInput[Unit])
- paths, which maps to the whole remaining path as a Seq[String]
- query[T](name) captures a query parameter with the given name
- queryParams captures all query parameters, represented as MultiQueryParams
- cookie[T](name) captures a cookie from the Cookie header with the given name
- extractFromRequest extracts a value from the request. This input is only used by server interpreters, ignored by documentation interpreters. Client interpreters ignore the provided value.
For both inputs/outputs:
- header[T](name) captures a header with the given name
- headers captures all headers, represented as Seq[(String, String)]
- cookies captures cookies from the Cookie header and represents them as List[Cookie]
- setCookie(name) captures the value & metadata of a Set-Cookie header with a matching name
- setCookies captures cookies from the Set-Cookie header and represents them as List[SetCookie]
- body[T, M], stringBody, plainBody[T], jsonBody[T], binaryBody[T], formBody[T], multipartBody[T] capture the body
- streamBody[S] captures the body as a stream: only a client/server interpreter supporting streams of type S can be used with such an endpoint
For outputs:
- statusCode maps to the status code of the response
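As a quick illustration (a sketch; the path and parameter names are made up), a few of the above can be mixed in a single endpoint:
import tapir._

// illustrative input mix: a path capture, an optional header and an optional cookie
val getUser: Endpoint[(String, Option[String], Option[String]), Unit, String, Nothing] =
  endpoint.get
    .in("users" / path[String])
    .in(header[Option[String]]("X-Request-Id"))
    .in(cookie[Option[String]]("session"))
    .out(stringBody)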
Combining inputs and outputs¶
Endpoint inputs/outputs can be combined in two ways. However they are combined, the values they represent always accumulate into tuples of values.
First, descriptions can be combined using the .and method. Such a combination results in an input/output which maps to a tuple of the given types, and can be stored as a value and re-used in multiple endpoints. As with all other values in tapir, endpoint input/output descriptions are immutable. For example, an input specifying two query parameters, start (mandatory) and limit (optional), can be written down as:
val paging: EndpointInput[(UUID, Option[Int])] =
  query[UUID]("start").and(query[Option[Int]]("limit"))

// we can now use the value in multiple endpoints, e.g.:
val listUsersEndpoint: Endpoint[(UUID, Option[Int]), Unit, List[User], Nothing] =
  endpoint.in("user" / "list").in(paging).out(jsonBody[List[User]])
Second, inputs can be combined by calling the in, out and errorOut methods on Endpoint multiple times. Each time such a method is invoked, it extends the list of inputs/outputs. This can be useful to separate different groups of parameters, but also to define template-endpoints, which can then be further specialized. For example, we can define a base endpoint for our API, where all paths always start with /api/v1.0, and errors are always returned as a json:
val baseEndpoint: Endpoint[Unit, ErrorInfo, Unit, Nothing] =
  endpoint.in("api" / "v1.0").errorOut(jsonBody[ErrorInfo])
Thanks to the fact that inputs/outputs accumulate, we can use the base endpoint to define more inputs, for example:
val statusEndpoint: Endpoint[Unit, ErrorInfo, Status, Nothing] =
  baseEndpoint.in("status").out(jsonBody[Status])
The above endpoint will correspond to the api/v1.0/status path.
Mapping over input values¶
Inputs/outputs can also be mapped over. As noted before, all mappings are bi-directional, so that they can be used both when interpreting an endpoint as a server, and as a client, as well as both in input and output contexts.
There’s a couple of ways to map over an input/output. First, there’s the map[II](f: I => II)(g: II => I) method, which accepts functions which provide the mapping in both directions. For example:
case class Paging(from: UUID, limit: Option[Int])

val paging: EndpointInput[Paging] =
  query[UUID]("start").and(query[Option[Int]]("limit"))
    .map { case (from, limit) => Paging(from, limit) } (paging => (paging.from, paging.limit))
Creating a mapping between a tuple and a case class is a common operation, hence there’s also a mapTo(CaseClassCompanion) method, which automatically provides the mapping functions:
case class Paging(from: UUID, limit: Option[Int])

val paging: EndpointInput[Paging] =
  query[UUID]("start").and(query[Option[Int]]("limit"))
    .mapTo(Paging)
Mapping methods can also be called on an endpoint (which is useful if inputs/outputs are accumulated, for example).
The Endpoint.mapIn, Endpoint.mapInTo etc. methods have the same signatures as the ones above.
Path matching¶
By default (as with all other types of inputs), if no path input/path segments are defined, any path will match.
If any path input/path segment is defined, the path must match exactly - any remaining path segments will cause the
endpoint not to match the request. For example, endpoint.in("api") will match /api and /api/, but won’t match / or /api/users.
To match only the root path, use an empty string: endpoint.in("") will match http://server.com/ and http://server.com.
To match a path prefix, first define inputs which match the path prefix, and then capture any remaining part using paths, e.g.: endpoint.in("api" / "download").in(paths).
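For instance, a sketch of such a prefix-matching endpoint (the method and output are illustrative):
import tapir._

// matches any path starting with /api/download and captures the remaining segments
val download: Endpoint[Seq[String], Unit, String, Nothing] =
  endpoint.get.in("api" / "download").in(paths).out(stringBody)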
Status codes¶
To provide the status code of a server response, use the statusCode output, which maps to the type tapir.model.StatusCode (an alias for Int). The tapir.model.StatusCodes object contains known status codes as constants. This type of output is used only when interpreting the endpoint as a server.
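For example, a sketch (the endpoint, path and logic are illustrative) where the server logic picks the status code at runtime:
import tapir._
import tapir.model.{StatusCode, StatusCodes}
import scala.concurrent.Future

// the status code is part of the output value, so the logic decides it per-request
val submit: Endpoint[String, Unit, (String, StatusCode), Nothing] =
  endpoint.post.in("submit").in(stringBody).out(stringBody).out(statusCode)

def submitLogic(payload: String): Future[Either[Unit, (String, StatusCode)]] =
  if (payload.isEmpty) Future.successful(Right(("empty payload ignored", StatusCodes.BadRequest)))
  else Future.successful(Right(("queued", 202))) // StatusCode is an Int alias, so a literal works too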
It is also possible to specify how to determine the status code based on the value of an output (typically the body). This is used when interpreting the endpoint as a server and when generating documentation.
For example, below is a specification for an endpoint where the error output is fixed to be of type ErrorInfo; such a specification can then be refined and reused for other endpoints:
case class ErrorInfo(errorType: ErrorType, msg: String)

val baseEndpoint = endpoint.errorOut(
  statusFrom(
    jsonBody[ErrorInfo],
    StatusCodes.BadRequest,
    whenValue[ErrorInfo](_.errorType == ErrorType.NotFound, StatusCodes.NotFound),
    whenValue[ErrorInfo](_.errorType == ErrorType.Exception, StatusCodes.InternalServerError)
  )
)
The statusFrom method takes as parameters: the wrapped output, the default status code, and any number of specific status code mappings, based on the value (whenValue) or class (whenClass) of the output value.
Codecs¶
A codec specifies how to map from and to raw values that are sent over the network. Raw values, which are natively supported by client/server interpreters, include Strings, byte arrays, Files and multiparts.
There are built-in codecs for most common types such as String, Int etc. Codecs are usually defined as implicit values and resolved implicitly when they are referenced.
For example, a query[Int]("quantity") specifies an input parameter which corresponds to the quantity query parameter and will be mapped as an Int. There’s an implicit Codec[Int] value that is referenced by the query method (which is defined in the tapir package).
In a server setting, if the value cannot be parsed as an int, a decoding failure is reported, and the endpoint won’t match the request, or a 400 Bad Request response is returned (depending on configuration).
Optional and multiple parameters¶
Some inputs/outputs allow optional, or multiple parameters:
- path segments are always required
- query and header values can be optional or multiple (repeated query parameters/headers)
- bodies can be optional, but not multiple
In general, optional parameters are represented as Option values, and multiple parameters as List values.
For example, header[Option[String]]("X-Auth-Token") describes an optional header. An input described as query[List[String]]("color") allows multiple occurrences of the color query parameter, with all values gathered into a list.
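For reference, a sketch of the types of such input descriptions:
import tapir._

// an optional header and a repeatable query parameter
val authToken: EndpointIO[Option[String]] = header[Option[String]]("X-Auth-Token")
val colors: EndpointInput[List[String]] = query[List[String]]("color")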
Implementation note¶
To support optional and multiple parameters, inputs/outputs don’t require implicit Codec values (which represent only mandatory values), but CodecForOptional and CodecForMany implicit values.
A CodecForOptional can be used in a context which allows optional values. Given a Codec[T], instances of both CodecForOptional[T] and CodecForOptional[Option[T]] will be generated (that’s also the way to add support for custom optional types). The first one will require a value, and report a decoding failure if a value is missing. The second will properly map to an Option, depending on whether the value is present.
Schemas¶
A codec also contains the schema of the mapped type. This schema information is used when generating documentation.
For primitive types, the schema values are built-in, and include values such as Schema.SString, Schema.SArray, Schema.SBinary etc.
The schema is left unchanged when mapping over a codec, as the underlying representation of the value doesn’t change.
When codecs are derived for complex types, e.g. for json mapping, schemas are looked up through implicit SchemaFor[T] values. See json support for more details.
Tapir supports schema generation for coproduct types out of the box. In order to extend the OpenAPI schema representation, a discriminator object can be specified.
For example, given the following coproduct:
sealed trait Entity {
  def kind: String
}

case class Person(firstName: String, lastName: String) extends Entity {
  def kind: String = "person"
}

case class Organization(name: String) extends Entity {
  def kind: String = "org"
}
The discriminator may look like:
val sPerson = implicitly[SchemaFor[Person]]
val sOrganization = implicitly[SchemaFor[Organization]]
implicit val sEntity: SchemaFor[Entity] =
  SchemaFor.oneOf[Entity, String](_.kind, _.toString)("person" -> sPerson, "org" -> sOrganization)
Media types¶
Codecs carry an additional type parameter, which specifies the media type. Some built-in media types include text/plain, application/json and multipart/form-data. Custom media types can be added by creating an implementation of the tapir.MediaType trait.
Thanks to codecs being parametrised by media types, it is possible to have a Codec[MyCaseClass, TextPlain, _] which specifies how to serialize a case class to plain text, and a different Codec[MyCaseClass, Json, _], which specifies how to serialize a case class to json. Both can be implicitly available without implicit resolution conflicts.
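As an illustration, here’s a sketch (Note is a made-up type; only the mapDecode method described in the custom types section is assumed) of a plain-text codec that can coexist with a json codec for the same class:
import tapir._

case class Note(text: String)

// a plain-text codec for Note, built by mapping over the built-in UTF-8 string codec
implicit val notePlainCodec: Codec[Note, MediaType.TextPlain, String] =
  Codec.stringPlainCodecUtf8.mapDecode(s => DecodeResult.Value(Note(s)))(_.text)

// a Codec[Note, Json, _] for the same class can additionally be derived via the circe
// integration (see "Working with JSON"); the two don't conflict, as the media types differ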
Different media types can be used in different contexts. When defining a path, query or header parameter, only a codec with the TextPlain media type can be used. However, for bodies, any media type is allowed. For example, the input/output described by jsonBody[T] requires a json codec.
Custom types¶
Support for custom types can be added by writing a codec from scratch, or by mapping over an existing codec. However, custom types can also be supported by mapping over inputs/outputs, not codecs. When should you use which approach?
In general, codecs should be used when translating between raw values and “application-primitives”. Codecs also allow the decoding process to result in an error, or to complete successfully. For example, to support a custom id type:
def decode(s: String): DecodeResult[MyId] = MyId.parse(s) match {
  case Success(v) => DecodeResult.Value(v)
  case Failure(f) => DecodeResult.Error(s, f)
}
def encode(id: MyId): String = id.toString

implicit val myIdCodec: Codec[MyId, TextPlain, _] = Codec.stringPlainCodecUtf8
  .mapDecode(decode)(encode)
Additionally, if a type is supported by a codec, it can be used in multiple contexts, such as query parameters, headers, bodies, etc. Mapped inputs by construction have a fixed context.
On the other hand, when building composite types out of many values, or when an isomorphic representation of a type is needed, but only for a single input/output/endpoint, mapping over an input/output is the simpler solution. Note that while codecs can report errors during decoding, mapping over inputs/outputs doesn’t have this possibility.
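For contrast, a sketch of the second approach (AddressLine is a made-up wrapper that always constructs successfully):
import tapir._

case class AddressLine(value: String)

// mapping over a single input: simple, but tied to this query parameter,
// and with no way to report a decoding error
val addressLine: EndpointInput[AddressLine] =
  query[String]("address").map(AddressLine(_))(_.value)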
Validation¶
While codecs support reporting decoding failures, this is not meant as a validation solution, as it only works on single values, while validation often involves multiple combined values.
Decoding failures should be reported when the input is in an incorrect low-level format, when parsing a “raw value” fails. In other words, decoding failures should be reported for format failures, not business validation errors.
Any validation should be done as part of the “business logic” methods provided to the server interpreters. In case validation fails, the result can be an error, which is one of the mappings defined in an endpoint (the E in Endpoint[I, E, O, S]).
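A sketch of this approach (the form, error and result types are made up):
import scala.concurrent.Future

case class RegistrationForm(name: String, age: Int)
case class User(name: String)

// business validation happens in the server logic; failures become the endpoint's error output E
def registerLogic(form: RegistrationForm): Future[Either[String, User]] =
  if (form.age < 0) Future.successful(Left("age must be non-negative"))
  else Future.successful(Right(User(form.name)))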
Next¶
Read on about json support.
Working with JSON¶
Json values are supported through codecs which encode/decode values to json strings. However, third-party libraries are needed for actual json parsing/printing. Currently, Circe is supported. To use, add the following dependency to your project:
"com.softwaremill.tapir" %% "tapir-json-circe" % "0.7.10"
Next, import the package (or extend the TapirJsonCirce trait, see MyTapir):
import tapir.json.circe._
This will bring into scope Codecs which, given an in-scope circe Encoder/Decoder and a SchemaFor, will create a codec using the json media type. Circe includes a couple of approaches to generating encoders/decoders (manual, semi-auto and auto), so you may choose whatever suits you.
For example, to automatically generate a JSON codec for a case class:
import tapir._
import tapir.json.circe._
import io.circe.generic.auto._
case class Book(author: String, title: String, year: Int)
val bookInput: EndpointIO[Book] = jsonBody[Book]
To add support for other JSON libraries, see the sources for the Circe codec (which is just a couple of lines of code).
Schemas¶
To create a json codec automatically, not only a circe Encoder/Decoder is needed, but also an implicit SchemaFor[T] value, which provides a mapping between a type T and its schema. A schema-for value contains a single schema: Schema field.
For custom types, schemas are derived automatically using Magnolia, given that schemas are defined for all of the case class’s fields. It is possible to configure the automatic derivation to use snake-case, kebab-case or a custom field naming policy, by providing an implicit tapir.generic.Configuration value:
implicit val customConfiguration: Configuration =
  Configuration.default.withSnakeCaseMemberNames
Alternatively, SchemaFor values can be defined by hand, either for whole case classes, or only for some of their fields. For example, here we state that the schema for MyCustomType is a String:
implicit val schemaForMyCustomType: SchemaFor[MyCustomType] = SchemaFor(Schema.SString)
Next¶
Read on about working with forms.
Form support¶
URL-encoded forms¶
A URL-encoded form input/output can be specified in two ways. First, it is possible to map all form fields as a Seq[(String, String)], or as a Map[String, String] (which is more convenient if fields can’t have multiple values):
formBody[Seq[(String, String)]]: EndpointIO[Seq[(String, String)], MediaType.XWwwFormUrlencoded, _]
formBody[Map[String, String]]: EndpointIO[Map[String, String], MediaType.XWwwFormUrlencoded, _]
Second, form data can be mapped to a case class. The codec for the case class is generated using a macro at compile-time. The fields of the case class should have types, for which there is a plain text codec. For example:
case class RegistrationForm(name: String, age: Int, news: Boolean, city: Option[String])
formBody[RegistrationForm]
Each form field is named the same as the corresponding case class field. The names can be transformed to snake or kebab case by providing an implicit tapir.generic.Configuration.
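Putting it together, a sketch of an endpoint using this form body (the method and path are illustrative):
import tapir._

case class RegistrationForm(name: String, age: Int, news: Boolean, city: Option[String])

// the form codec for RegistrationForm is generated at compile time
// from the plain-text codecs of its fields
val registerEndpoint: Endpoint[RegistrationForm, Unit, Unit, Nothing] =
  endpoint.post.in("register").in(formBody[RegistrationForm])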
Multipart forms¶
Similarly as above, multipart form input/outputs can be specified in two ways. To map to all parts of a multipart body, use:
multipartBody[Seq[AnyPart]]: EndpointIO[Seq[AnyPart], MediaType.MultipartFormData, _]
where type AnyPart = Part[_]. Part is a case class containing the name of the part, disposition parameters, headers, and the body. The bodies will be mapped as byte arrays (Array[Byte]), unless a custom multipart codec is defined using the Codec.multipartCodec method.
As with URL-encoded forms, multipart bodies can be mapped directly to case classes, however without the restriction on codecs for individual fields. Given a field of type T, first a plain text codec is looked up, and if one isn’t found, any codec for any media type (e.g. JSON) is searched for.
Each part is named the same as the corresponding case class field. The names can be transformed to snake or kebab case by providing an implicit tapir.generic.Configuration.
Additionally, the case class to which the multipart body is mapped can contain both normal fields, and fields of type Part[T]. This is useful if part metadata (e.g. the filename) is relevant.
For example:
case class RegistrationForm(userData: User, photo: Part[File], news: Boolean)
multipartBody[RegistrationForm]
Next¶
Read on about authentication.
Authentication¶
Inputs which carry authentication data can be marked as such by declaring them using members of the auth object. Apart from having predefined codecs for some authentication methods, such inputs will be treated differently when generating documentation. Otherwise, they behave as normal inputs which map to the given type.
Currently, the following authentication inputs are available (assuming import tapir._):
- auth.apiKey(anotherInput): wraps any other input and designates it as an api key. The input is typically a header, cookie or a query parameter
- auth.basic: EndpointInput[UsernamePassword]: maps to the base64-encoded username/password pair in the Authorization header
- auth.bearer: EndpointInput[String]: maps to Bearer [token] in the Authorization header
- auth.oauth2.authorizationCode(authorizationUrl, tokenUrl, scopes, refreshUrl): EndpointInput[String]: creates an OAuth2 authorization using an authorization code - sign in using an auth service (for documentation, this also requires defining the oauth2-redirect.html, see Generating OpenAPI documentation)
Multiple authentication inputs indicate that all of the given authentication values should be provided. Specifying alternative authentication methods (where only one value out of many needs to be provided) is currently not supported.
When interpreting a route as a server, it is useful to define the authentication input first, to be able to share the authentication logic among multiple endpoints easily. See common server options for more details.
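For instance, a sketch of a bearer-protected endpoint (the path and outputs are illustrative), with the authentication input defined first:
import tapir._

// the bearer token becomes the first input parameter, which makes it easier to apply
// shared authentication logic first when interpreting the endpoint as a server
val secureProfile: Endpoint[(String, String), String, String, Nothing] =
  endpoint.get
    .in(auth.bearer)
    .in("profile" / path[String])
    .errorOut(stringBody)
    .out(stringBody)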
Next¶
Read on for a summary of the implicits used for custom types.
Implicits guide for custom types¶
A could not find implicit value error can sometimes be puzzling, so here’s a short summary of the kinds of implicits tapir uses to support custom types.
In general, when using a custom type in any context, an implicit Codec[T, _, _] is required. Codecs for custom types can either be derived automatically, or created based on existing codecs.
Path, query parameters and headers¶
When using a custom type for a path parameter, query parameter or header value, you’ll need a codec with the text/plain media type. You can use an existing codec and map over it to create a new one. For example:
case class MyId(...)
object MyId {
  def parse(s: String): Try[MyId] = ...
}

def decode(s: String): DecodeResult[MyId] = MyId.parse(s) match {
  case Success(v) => DecodeResult.Value(v)
  case Failure(f) => DecodeResult.Error(s, f)
}
def encode(id: MyId): String = id.toString

implicit val myIdCodec: Codec[MyId, TextPlain, _] = Codec.stringPlainCodecUtf8
  .mapDecode(decode)(encode)
Text and binary bodies¶
The approach for text and binary bodies is the same as for queries/paths/headers. To support a custom type, you’ll need to map over an existing codec, for example Codec.byteArrayCodec or Codec.stringPlainCodecUtf8, and assign the result to an implicit value.
JSON bodies¶
When working with json bodies, the custom types can be much more complex than when mapping a query or path parameter.
Using the circe integration, a Codec[T, Json, _], where T is a case class, can be automatically derived given the following implicit values:
- io.circe.Encoder[T]
- io.circe.Decoder[T]
- tapir.SchemaFor[T]
The circe encoders/decoders have to be provided using one of the methods supported by Circe, e.g. by importing io.circe.generic.auto._.
The SchemaFor[T] can be auto-generated using Magnolia, or provided by hand. See json for more details.
In the future, it would be ideal if encoders/decoders could be derived automatically from the schema. For now however, the schema and the json encoders have to be provided separately.
Form bodies¶
When mapping either url-encoded or multipart form bodies, for each field a plain codec has to be available in the implicit scope. That is, a value of type Codec[R, TextPlain, _], for each R which is a field of the case class to which the data is being mapped.
Running as an akka-http server¶
To expose an endpoint as an akka-http server, first add the following dependency:
"com.softwaremill.tapir" %% "tapir-akka-http-server" % "0.7.10"
and import the package:
import tapir.server.akkahttp._
This adds extension methods to the Endpoint type: toDirective, toRoute and toRouteRecoverErrors. The first two require the logic of the endpoint to be given as a function of type:
I => Future[Either[E, O]]
The third recovers errors from failed futures, and hence requires that E is a subclass of Throwable (an exception); it expects a function of type I => Future[O].
For example:
import tapir._
import tapir.server.akkahttp._
import scala.concurrent.Future
import akka.http.scaladsl.server.Route
def countCharacters(s: String): Future[Either[Unit, Int]] =
  Future.successful(Right[Unit, Int](s.length))

val countCharactersEndpoint: Endpoint[String, Unit, Int, Nothing] =
  endpoint.in(stringBody).out(plainBody[Int])

val countCharactersRoute: Route = countCharactersEndpoint.toRoute(countCharacters)
Note that these functions take one argument, which is a tuple of type I. This means that functions which take multiple arguments need to be converted to a function of a single (tuple) argument using .tupled:
def logic(s: String, i: Int): Future[Either[Unit, String]] = ???
val anEndpoint: Endpoint[(String, Int), Unit, String, Nothing] = ???
val aRoute: Route = anEndpoint.toRoute((logic _).tupled)
The created Route/Directive can then be further combined with other akka-http directives, for example nested within other routes. The tapir-generated Route/Directive captures from the request only what is described by the endpoint.
It’s completely feasible that some part of the input is read using akka-http directives, and the rest using tapir endpoint descriptions; or, that the tapir-generated route is wrapped in e.g. a metrics route. Moreover, “edge-case endpoints”, which require some special logic not expressible using tapir, can be always implemented directly using akka-http. For example:
val myRoute: Route = metricsDirective {
securityDirective { user =>
tapirEndpoint.toRoute(input => /* here we can use both `user` and `input` values */)
}
}
Streaming¶
The akka-http interpreter accepts streaming bodies of type Source[ByteString, Any], which can be used both for sending response bodies and reading request bodies. Usage: streamBody[Source[ByteString, Any]](schema, mediaType).
Configuration¶
The interpreter can be configured by providing an implicit AkkaHttpServerOptions value and status mappers, see common server configuration for details.
Defining an endpoint together with the server logic¶
It’s also possible to define an endpoint together with the server logic in a single, more concise step. See common server configuration for details.
Running as an http4s server¶
To expose an endpoint as an http4s server, first add the following dependency:
"com.softwaremill.tapir" %% "tapir-http4s-server" % "0.7.10"
and import the package:
import tapir.server.http4s._
This adds two extension methods to the Endpoint type: toRoutes and toRoutesRecoverErrors. The first requires the logic of the endpoint to be given as a function of type:
I => F[Either[E, O]]
where F[_] is the chosen effect type. The second recovers errors from failed effects, and hence requires that E is a subclass of Throwable (an exception); it expects a function of type I => F[O]. For example:
import tapir._
import tapir.server.http4s._
import cats.effect.IO
import org.http4s.HttpRoutes
import cats.effect.ContextShift
// will probably come from somewhere else
implicit val cs: ContextShift[IO] =
  IO.contextShift(scala.concurrent.ExecutionContext.global)

def countCharacters(s: String): IO[Either[Unit, Int]] =
  IO.pure(Right[Unit, Int](s.length))

val countCharactersEndpoint: Endpoint[String, Unit, Int, Nothing] =
  endpoint.in(stringBody).out(plainBody[Int])

val countCharactersRoutes: HttpRoutes[IO] =
  countCharactersEndpoint.toRoutes(countCharacters _)
Note that these functions take one argument, which is a tuple of type I. This means that functions which take multiple arguments need to be converted to a function of a single (tuple) argument using .tupled:
def logic(s: String, i: Int): IO[Either[Unit, String]] = ???
val anEndpoint: Endpoint[(String, Int), Unit, String, Nothing] = ???
val aRoute: HttpRoutes[IO] = anEndpoint.toRoutes((logic _).tupled)
The created HttpRoutes are the usual http4s Kleisli-based transformation of a Request to a Response, and can be further composed using http4s middlewares or request-transforming functions. The tapir-generated HttpRoutes captures from the request only what is described by the endpoint.
It’s completely feasible that some part of the input is read using a http4s wrapper function, which is then composed with the tapir endpoint descriptions. Moreover, “edge-case endpoints”, which require some special logic not expressible using tapir, can be always implemented directly using http4s.
Streaming¶
The http4s interpreter accepts streaming bodies of type Stream[F, Byte], which can be used both for sending response bodies and reading request bodies. Usage: streamBody[Stream[F, Byte]](schema, mediaType).
Configuration¶
The interpreter can be configured by providing an implicit Http4sServerOptions value and status mappers, see common server configuration for details.
The http4s options also include configuration for the blocking execution context to use, and the io chunk size.
Defining an endpoint together with the server logic¶
It’s also possible to define an endpoint together with the server logic in a single, more concise step. See common server configuration for details.
Common server options¶
Status codes¶
By default, successful responses are returned with the 200 OK status code, and errors with 400 Bad Request. However, this can be customised by specifying how an output maps to the status code.
Defining an endpoint together with the server logic¶
It’s possible to combine an endpoint description with the server logic in a single object, ServerEndpoint[I, E, O, S, F]. Such an endpoint contains not only an endpoint of type Endpoint[I, E, O, S], but also a logic function I => F[Either[E, O]], for some effect F.
For example, the character-counting endpoint from the akka-http example can be written more concisely as follows:
import tapir._
import tapir.server.akkahttp._
import scala.concurrent.Future
import akka.http.scaladsl.server.Route
val countCharactersServerEndpoint: ServerEndpoint[String, Unit, Int, Nothing, Future] =
  endpoint.in(stringBody).out(plainBody[Int]).serverLogic { s =>
    Future.successful(Right[Unit, Int](s.length))
  }

val countCharactersRoute: Route = countCharactersServerEndpoint.toRoute
A ServerEndpoint can then be converted to a route using the .toRoute/.toRoutes methods (without any additional parameters), or to documentation.
Moreover, a list of server endpoints can be converted to routes or documentation as well:
val endpoint1 = endpoint.in("hello").out(stringBody)
  .serverLogic { _ => Future.successful("world") }

val endpoint2 = endpoint.in("ping").out(stringBody)
  .serverLogic { _ => Future.successful("pong") }

val route: Route = List(endpoint1, endpoint2).toRoute
Note that when dealing with endpoints which have multiple input parameters, the server logic function is a function of a single argument, which is a tuple; hence you’ll need to pattern-match using case to extract the parameters:
val echoEndpoint = endpoint
  .in(query[Int]("count"))
  .in(stringBody)
  .out(stringBody)
  .serverLogic { case (count, body) =>
    Future.successful(body * count)
  }
Server options¶
Each interpreter accepts an implicit options value, which contains configuration values for:
- how to create a file (when receiving a response that is mapped to a file, or when reading a file-mapped multipart part)
- how to handle decode failures
To customise the server options, define an implicit value, which will be visible when converting an endpoint or multiple endpoints to a route/routes. For example, for AkkaHttpServerOptions:
implicit val customServerOptions: AkkaHttpServerOptions = AkkaHttpServerOptions.default.copy(...)
Handling decode failures¶
Quite often user input will be malformed and decoding will fail. Should the request be completed with a 400 Bad Request response, or should the request be forwarded to another endpoint? By default, tapir follows OpenAPI conventions, in that an endpoint is uniquely identified by the method and served path. That’s why:
- an “endpoint doesn’t match” result is returned if the request method or path doesn’t match. The http library should attempt to serve this request with the next endpoint.
- otherwise, we assume that this is the correct endpoint to serve the request, but the parameters are somehow malformed. A 400 Bad Request response is returned if a query parameter, header or body is missing / decoding fails, or if decoding a path capture fails with an error (but not with a “missing” decode result).
This can be customised by providing an implicit instance of tapir.server.DecodeFailureHandler, which, based on the request, the failing input and the failure description, can decide whether to return a “no match” or a specific response.
Only the first failure is passed to the DecodeFailureHandler. Inputs are decoded in the following order: method, path, query, header, body.
Extracting common route logic¶
Quite often, especially for authentication, some part of the route logic is shared among multiple endpoints. However, these functions don’t compose in a straightforward way, as authentication usually operates on a single input, which is only a part of the whole logic’s input. Suppose you have the following methods:
type AuthToken = String
def authFn(token: AuthToken): Future[Either[ErrorInfo, User]]
def logicFn(user: User, data: String, limit: Int): Future[Either[ErrorInfo, Result]]
which you’d like to apply to an endpoint with type:
val myEndpoint: Endpoint[(AuthToken, String, Int), ErrorInfo, Result, Nothing] = ...
To avoid composing these functions by hand, tapir defines the helper extension methods andThenFirst and andThenFirstE. The first one should be used when errors are represented as failed wrapper types (e.g. failed futures), the second when errors are represented as Eithers.
These extension methods are defined in the same traits as the route interpreters, both for Future (in the akka-http interpreter) and for an arbitrary monad (in the http4s interpreter), so importing the package is sufficient to use them:
import tapir.server.akkahttp._
val r: Route = myEndpoint.toRoute((authFn _).andThenFirstE((logicFn _).tupled))
Writing down the types, here are the generic signatures of andThenFirst and andThenFirstE:
f1: T => Future[U]
f2: (U, A1, A2, ...) => Future[O]
(f1 _).andThenFirst(f2): (T, A1, A2, ...) => Future[O]
f1: T => Future[Either[E, U]]
f2: (U, A1, A2, ...) => Future[Either[E, O]]
(f1 _).andThenFirstE(f2): (T, A1, A2, ...) => Future[Either[E, O]]
Exception handling¶
There’s no exception handling built into tapir. However, tapir contains a more general error handling mechanism, as the endpoints can contain dedicated error outputs.
If the logic function, which is passed to the server interpreter, fails (i.e. throws an exception, which results in a failed Future or IO/Task), this is propagated to the library (akka-http or http4s).
However, any exceptions can be recovered from and mapped to an error value. For example:
type ErrorInfo = String

def logic(s: String): Future[Int] = ...

def handleErrors[T](f: Future[T]): Future[Either[ErrorInfo, T]] =
  f.transform {
    case Success(v) => Success(Right(v))
    case Failure(e) =>
      logger.error("Exception when running endpoint logic", e)
      Success(Left(e.getMessage))
  }

endpoint
  .errorOut(plainBody[ErrorInfo])
  .out(plainBody[Int])
  .in(query[String]("name"))
  .toRoute((logic _).andThen(handleErrors))
In the above example, errors are represented as Strings (aliased to ErrorInfo for readability). When the logic completes successfully an Int is returned. Any exceptions that are raised are logged, and represented as a value of type ErrorInfo.
Following the convention, the left side of the Either[ErrorInfo, T] represents an error, and the right side success.
Alternatively, errors can be recovered from failed effects and mapped to the error output, provided that the E type in the endpoint description is itself a subclass of exception. This can be done using the toRouteRecoverErrors method.
Debugging servers¶
When dealing with multiple endpoints, how to find out which endpoint handled a request, or why an endpoint didn’t handle a request?
For this purpose, tapir provides optional logging. The logging options (and messages) can be customised by changing the default LoggingOptions class, which is part of the server options.
The following can be logged:
1. DEBUG-log, when a request is handled by an endpoint, or when the inputs can’t be decoded, and the decode failure maps to a response
2. DEBUG-log, when the inputs can’t be decoded, and the decode failure doesn’t map to a response (the next endpoint will be tried)
3. ERROR-log, when there’s an exception during evaluation of the server logic
By default, logs of type (1) and (3) are logged. Logging all decode failures (2) might be helpful when debugging, but can also produce a large amount of logs.
Even if logging for a particular category (as described above) is set to true, normal logger rules apply - if you don’t see the logs, please verify your logging levels for the appropriate packages.
Using as an sttp client¶
Add the dependency:
"com.softwaremill.tapir" %% "tapir-sttp-client" % "0.7.10"
To make requests using an endpoint definition using sttp, import:
import tapir.client.sttp._
This adds the toRequest(Uri) extension method to any Endpoint instance which, given the base URI, returns a function:
[I as function arguments] => Request[Either[E, O], Nothing]
After providing the input parameters, the result is a description of the request to be made, which can be further customised and sent using any sttp backend.
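For example, a sketch (reusing the countCharactersEndpoint from the server examples, and assuming sttp 1.x imports):
import com.softwaremill.sttp._
import tapir.client.sttp._

// a request description for the countCharactersEndpoint against a local base URI;
// it can be customised further and sent with any sttp backend
val request: Request[Either[Unit, Int], Nothing] =
  countCharactersEndpoint.toRequest(uri"http://localhost:8080").apply("hello tapir")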
See the runnable example for example usage.
Generating OpenAPI documentation¶
To use, add the following dependencies:
"com.softwaremill.tapir" %% "tapir-openapi-docs" % "0.7.10"
"com.softwaremill.tapir" %% "tapir-openapi-circe-yaml" % "0.7.10"
Tapir contains a case class-based model of the openapi data structures in the openapi/openapi-model subproject (the model is independent from all other tapir modules and can be used stand-alone).
An endpoint can be converted to an instance of the model by importing the tapir.docs.openapi._ package and calling the provided extension method:
import tapir.openapi.OpenAPI
import tapir.docs.openapi._
val docs: OpenAPI = booksListing.toOpenAPI("My Bookshop", "1.0")
Such a model can then be refined, by adding details which are not auto-generated. Working with a deeply nested case class structure such as the OpenAPI one can be made easier by using a lens library, e.g. Quicklens.
The openapi case classes can then be serialised, either to JSON or YAML using Circe:
import tapir.openapi.circe.yaml._
println(docs.toYaml)
Exposing OpenAPI documentation¶
Exposing the OpenAPI documentation can be very application-specific. For example, to expose the docs using the Swagger UI and akka-http:
- add libraryDependencies += "org.webjars" % "swagger-ui" % "3.22.0" (or newer) to build.sbt
- generate the yaml content to serve as a String using tapir:
import tapir.docs.openapi._
import tapir.openapi.circe.yaml._
val docsAsYaml: String = myEndpoints.toOpenAPI("My App", "1.0").toYaml
- add the following routes to your server:
val SwaggerYml = "swagger.yml"

private val redirectToIndex: Route =
  redirect(s"/swagger/index.html?url=/swagger/$SwaggerYml", StatusCodes.PermanentRedirect)

// needed only if you use oauth2 authorization
private def redirectToOath2(query: String): Route =
  redirect(s"/swagger/oauth2-redirect.html$query", StatusCodes.PermanentRedirect)

private val swaggerVersion = {
  val p = new Properties()
  p.load(getClass.getResourceAsStream("/META-INF/maven/org.webjars/swagger-ui/pom.properties"))
  p.getProperty("version")
}

val routes: Route =
  pathPrefix("swagger") {
    pathEndOrSingleSlash {
      redirectToIndex
    } ~ path(SwaggerYml) {
      complete(docsAsYaml)
    } ~ getFromResourceDirectory(s"META-INF/resources/webjars/swagger-ui/$swaggerVersion/")
  } ~
    // needed only if you use oauth2 authorization
    path("oauth2-redirect.html") { request =>
      redirectToOath2(request.request.uri.rawQueryString.map(s => '?' + s).getOrElse(""))(request)
    }
Creating your own Tapir¶
Tapir uses a number of packages which contain either the data classes for describing endpoints or interpreters of this data (turning endpoints into a server or a client). Importing these packages every time you want to use Tapir may be tedious, that’s why each package object inherits all of its functionality from a trait.
Hence, it is possible to create your own object which combines all of the required functionalities and provides a single-import whenever you want to use tapir. For example:
object MyTapir extends Tapir
  with TapirAkkaHttpServer
  with TapirSttpClient
  with TapirCirceJson
  with TapirOpenAPICirceYaml
Then, after a single import MyTapir._, all tapir data types and extension methods will be in scope!
Contributing¶
Tapir is an early stage project. Everything might change. All suggestions welcome :)
See the list of issues and pick one! Or report your own.
If you have doubts about why or how something works, don’t hesitate to ask a question on gitter or via github. This probably means that the documentation, scaladocs or code is unclear and can be improved for the benefit of all.