Gavin Bisesi • @Daenyth
@47Degrees
Questions at the end
Programming with "pure functions", aka programming with Referential Transparency
A property of expressions (not statements (`return`) or declarations (`def`, `class`))
The result of any expression can be replaced by its definition without changing the meaning
```scala
val two = 1 + 1
// two: Int = 2
two
// res0: Int = 2
1 + 1
// res1: Int = 2
```
👍
The result of any expression can be replaced by its definition without changing the meaning
```scala
val hello = println("hello")
// hello
hello
println("hello")
// hello
```
👎
Why program with Referential Transparency?
RT means we can substitute any expression for its definition and reason about code locally.
In other words: we use FP because we want a codebase that is easier to understand, test, and refactor (among other reasons).
Combinators are composable building-block functions that modify some input in a reusable way, e.g. `map` and `filter`.
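For instance, a small sketch with plain Scala collections:

```scala
// Each combinator is a small, reusable transformation that composes with the next
val evensDoubled: List[Int] =
  List(1, 2, 3, 4, 5)
    .filter(_ % 2 == 0) // keep the even numbers
    .map(_ * 2)         // double what remains
// evensDoubled: List[Int] = List(4, 8)
```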
```scala
import cats._
import cats.syntax.all._
import cats.effect._
import cats.effect.implicits._
import scala.concurrent._
import scala.concurrent.duration._
import ExecutionContext.Implicits.{global => futureEC}
import cats.effect.unsafe.implicits.{global => ioRuntime}

// Custom `unsafeRunSync` for mdoc reasons
import daenyth.talk.ce.MdocConsoleWorkaround._
def yolo[A](description: IO[A]) = description.unsafeRunSyncWithRedirect()
```
`IO` - a lot like Scala's `Future`, at first glance:

`apply` to wrap blocks of code
`map`, `flatMap`
Don't use `Await` with `Future`; don't use `unsafeRun*` with `IO`
```scala
import java.util.UUID

val rndUUID: IO[UUID] = IO(UUID.randomUUID())
// rndUUID: IO[UUID] = Delay(
//   <function0>,
//   cats.effect.tracing.TracingEvent$StackTrace
// )

val helloIO = rndUUID.flatMap(uuid => IO(println(s"Hello $uuid")))
// helloIO: IO[Unit] = FlatMap(
//   Delay(<function0>, cats.effect.tracing.TracingEvent$StackTrace),
//   <function1>,
//   cats.effect.tracing.TracingEvent$StackTrace
// )

yolo(helloIO >> helloIO)
// Hello a1fb5ebf-11dc-410b-8258-4872e36cf32e
// Hello fa27fb78-11a5-4a21-acff-f0c3028cbea8
```
`IO[A]` values

`IO[A]` describes a computation that will:
Produce a value of type `A`, or
Fail with a `Throwable`, or
Never terminate

```scala
def game = for {
  _ <- IO.println("Guess a number")
  n <- IO.readLine
  _ <- if (n == "10") IO.println("Yay!")
       else IO.raiseError(new Exception("wrong"))
} yield ()

def clock: IO[Unit] =
  IO.sleep(1.minute) >>
    Clock[IO].realTimeInstant.flatMap(IO.println) >>
    clock
```
`IO` is based on the `Fiber` abstraction for low-level concurrency

```scala
// Simplified
trait IO[A] {
  def start: IO[Fiber[A]]
}

trait Fiber[A] {
  def cancel: IO[Unit]
  def join: IO[Outcome[IO, Throwable, A]]
}
```
`Fiber`s are a low-level building block - you usually don't need to interact with them directly.

`IO` Execution

`IO` runs on `Fiber`s, which are cheap; having lots of `Fiber`s is not a problem.

`IO` Concurrency

Both low-level `Fiber`s and high-level combinators:

`io.start` - fork into a `Fiber`; `Fiber#cancel` stops it
`IO.race(first, second)` - concurrently execute, return the winner and cancel the loser
`io.timeout(duration)` - fails with `TimeoutException` if not complete within `duration`
`io.timeoutTo(duration, fallbackIO)` - execute `fallbackIO` if `io` is not complete within `duration`
`IO.sleep(duration)`
Parallelism: `start`, `listOfIO.parSequence`, `(ioA, ioB).parTupled`
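A small sketch of a couple of these combinators (assumes the imports from the setup block above; the durations and string values are arbitrary):

```scala
// Race two tasks: the loser is cancelled automatically
val winner: IO[Either[String, String]] =
  IO.race(
    IO.sleep(100.millis).as("fast"),
    IO.sleep(1.second).as("slow")
  )

// Fall back to another IO if the first one takes too long
val withFallback: IO[String] =
  IO.sleep(5.seconds).as("done").timeoutTo(1.second, IO.pure("fallback"))
```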
`Resource[IO, A]` describes the ability to initialize and release a resource

Like `try`/`finally`, but composable and referentially transparent
`catch`/`finally` aren't transparent - they don't talk about return values
A method taking a `Resource[IO, A]` parameter will acquire and release internally
A method taking an `A` reuses the live value
When `use` completes, fails, or is cancelled, the Resource is closed. It will not leak.
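A minimal sketch of building one with `Resource.make`; the "handle" here is just a `String` stand-in for a real resource:

```scala
// acquire / release pair: the release action always runs,
// even if `use` fails or is cancelled
val fileResource: Resource[IO, String] =
  Resource.make(
    IO.println("opening") >> IO.pure("file-handle")  // acquire
  )(handle => IO.println(s"closing $handle"))        // release

val useIt: IO[Unit] =
  fileResource.use(handle => IO.println(s"writing to $handle"))
// Running useIt prints: opening / writing to file-handle / closing file-handle
```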
All the capabilities that `IO` exposes are described by typeclasses, allowing more generic code and multi-library compatibility. This talk focuses on `IO` and not the typeclasses.
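As a rough idea of what the typeclass style looks like (a sketch only; `logLine` is not from the talk):

```scala
// Code written against Sync works for IO or any other effect with a Sync instance
def logLine[F[_]: Sync](msg: String): F[Unit] =
  Sync[F].delay(println(msg))

val asIO: IO[Unit] = logLine[IO]("hello from a typeclass")
```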
`IO` vs `Future` - Overview

Actions:
`IO` is a value that describes an action (possibly asynchronous)
`Future` is a handle to the result of an already-running action (possibly asynchronous)

`IO` vs `Future` - Speed

`IO` is optimized for throughput
`Future` is optimized for fairness: every `map`/`flatMap` is submitted to the `ExecutionContext` (hence the implicit `ec` argument)
`IO` has no `ec` argument

`IO` vs `Future` - Cancellation

`Future[A]` can't be cancelled - once constructed, it can't be stopped
`IO[A]` can be concurrently forked, and then either `join`ed or `cancel`ed

`IO` vs `Future` - Concurrency with `Future`

```scala
// Sequential execution
def jobOne: Future[Int] = Future(???)
def jobTwo: Future[String] = Future(???)

jobOne.flatMap(i => jobTwo.map(s => (i, s)))
```

```scala
// Concurrent execution
val jobOne: Future[Int] = Future(???)
val jobTwo: Future[String] = Future(???)

jobOne.flatMap(i => jobTwo.map(s => (i, s)))
```

Does this expression evaluate concurrently or sequentially?

```scala
jobOne.flatMap(i => jobTwo.map(s => (i, s)))
```

Impossible to tell - you have to read the implementation details.

`IO` vs `Future` - Concurrency with `IO`
```scala
val jobOne: IO[Int] = IO(???)
val jobTwo: IO[String] = IO(???)

// Sequential execution
jobOne.flatMap(i => jobTwo.map(s => (i, s)))

// Also sequential
val result1: IO[(Int, String)] = (jobOne, jobTwo).tupled

// Concurrent execution (manually)
for {
  j1Fiber <- jobOne.start
  j2Fiber <- jobTwo.start
  i <- j1Fiber.joinWithNever // join yields an Outcome; joinWithNever yields the value
  s <- j2Fiber.joinWithNever
} yield (i, s)

// Concurrent execution (higher level)
val result2: IO[(Int, String)] = (jobOne, jobTwo).parTupled
```
Useful building blocks (see the sketch after this list):

`Semaphore` - `N` permits, can acquire or release permits as needed. Acquire blocks until permits are available
`Ref` - a concurrency-safe mutable reference; use it instead of a `var`
`Dispatcher` - allows you to have controlled regions of `unsafeRun*` methods, for wrapping imperative libraries
`IOApp` - whole "pure" applications
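A rough sketch of `Ref` and `Semaphore` in use (assumes the imports from the setup block above):

```scala
import cats.effect.std.Semaphore

// Ref: concurrency-safe mutable state without a `var`
val counted: IO[Int] = for {
  counter <- Ref[IO].of(0)
  _       <- (1 to 5).toList.parTraverse(_ => counter.update(_ + 1))
  result  <- counter.get
} yield result // result is 5

// Semaphore: at most two tasks hold a permit at the same time
val limited: IO[List[Int]] = for {
  sem     <- Semaphore[IO](2)
  results <- (1 to 4).toList.parTraverse(n => sem.permit.use(_ => IO.pure(n * 10)))
} yield results
```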
Running `IO`: the `IORuntime`

`Future` uses `ExecutionContext.global`, but `global` isn't an ideal choice:
`global` creates more threads than cpu cores, or can stall if `global` decides not to make a new `Thread`

The `IORuntime` has dedicated pools:
a compute pool - `IO` values run here
a blocking pool - for `IO.blocking` or `IO.interruptible` (if you can `Thread.interrupt()` it)
a scheduler - for `sleep` calls

Blocking threads should be avoided.
Instead of `IO.apply`, use:

`IO.blocking` to wrap thread-blocking code
`IO.interruptible` if that code will behave well under `Thread.interrupt()`
`IO.blocking`

Use `IO.blocking` to wrap thread-blocking code:

```scala
def fos: java.io.FileOutputStream = ???
def bytes: Array[Byte] = ???

val writeFile: IO[Unit] = IO.blocking(fos.write(bytes))
```
`IO.interruptible`

Use `IO.interruptible` if that code will behave well under `Thread.interrupt()`:

```scala
val tenMinutes = 10 * 60 * 1000
// tenMinutes: Int = 600000

def blockingTask = IO.interruptible(many = false)(Thread.sleep(tenMinutes))

def printTime = Clock[IO].realTimeInstant.flatMap(t => IO.println(s"It is now $t"))

yolo(for {
  _    <- printTime
  task <- blockingTask.start
  _    <- IO.sleep(2.seconds)
  _    <- task.cancel
  _    <- printTime
} yield ())
// It is now 2021-08-02T15:18:30.698Z
// It is now 2021-08-02T15:18:32.714Z
```
`Temporal` and `Clock`

`IO.sleep(duration)` (or `Temporal[IO].sleep(duration)`)
`Clock[IO].realTime`, `.realTimeInstant`

```scala
// Time since epoch
yolo(Clock[IO].realTime)
// res12: FiniteDuration = 1627917512718 milliseconds

// Current Instant
yolo(Clock[IO].realTimeInstant)
// res13: java.time.Instant = 2021-08-02T15:18:32.720Z

// Monotonically incrementing clock
yolo(Clock[IO].monotonic)
// res14: FiniteDuration = 288656489830275 nanoseconds
```
Use `IOApp` for your "main" classes:

```scala
object MyMain extends IOApp {

  def run(args: List[String]): IO[ExitCode] = {
    val myAppResource = for {
      _      <- Resource.eval(IO.println("hello cats"))
      db     <- getDatabase
      result <- Resource.eval(myAppLogic(db))
      _      <- Resource.eval(IO.println(s"got $result"))
    } yield ()
    myAppResource.useForever.as(ExitCode.Success)
  }

  def getDatabase: Resource[IO, Database] = ???
  def myAppLogic(db: Database): IO[Int] = ???
}
```
`IO` can be introduced to a codebase using `Future`, or using its own "main" class.
`Future` from `IO`

Tip: Instantiating a `Future` value is a side effect, so it gets wrapped with `IO.apply`

```scala
def existingLogic(x: Int): Future[Int] = ???

def moreLogic(y: Int): IO[Int] =
  for {
    xResult <- IO.fromFuture(IO(existingLogic(42)))
  } yield xResult + y
```
`IO` from `Future`

```scala
import cats.effect.unsafe

trait MyTrait { def doWork: Future[Int] }

// NB: You can use `IORuntime.global`, but it's more flexible to take a parameter
class MyTraitImpl(implicit ioRuntime: unsafe.IORuntime) extends MyTrait {
  override def doWork: Future[Int] = ioLogic.map(_.length).unsafeToFuture()
  def ioLogic: IO[String] = ???
}
```
Sequential code: replace `;` with `flatMap`

```scala
// Imperative
def oneStep(): Unit = println("one")
def anotherStep(): Unit = println("two")

oneStep()
anotherStep()
```

```scala
// IO
def oneStepIO(): IO[Unit] = IO(println("one"))
def anotherStepIO(): IO[Unit] = IO(println("two"))

for {
  _ <- oneStepIO()
  _ <- anotherStepIO()
} yield ()
```
`IO` - `Future` code

`Future.apply` -> `IO.apply`
`Future.successful` -> `IO.pure`
`Future.failed` -> `IO.raiseError`
Implicit `Future` concurrency -> explicit concurrency combinators
Side effects in `map` -> explicit effects with `IO(sideEffect())` in `flatMap`
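As a rough before/after illustration of those substitutions (the `lookupUser` functions are hypothetical):

```scala
// Future version: eagerly constructed values
def lookupUserF(id: Int): Future[String] =
  if (id > 0) Future.successful(s"user-$id")
  else Future.failed(new IllegalArgumentException("bad id"))

// IO version: a value describing the same logic
def lookupUserIO(id: Int): IO[String] =
  if (id > 0) IO.pure(s"user-$id")
  else IO.raiseError(new IllegalArgumentException("bad id"))
```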
`IO` - imperative code and mutation

`new` on a stateful class is stateful; wrap it in `IO`
If it also needs teardown, build it with `Resource.make` instead of bare `IO`
`Random[IO]` for pure, performant, concurrency-safe randomness. Avoid `IO(Random.xyz())`
`Clock[IO]` for pure access to time.
Imperative statements become `IO` expressions combined with `flatMap`
"I want a `Foo` value but I have `IO[Foo]`, how do I get it out?" - `flatMap` it: `getFooIO.flatMap(foo => useFooIO(foo))`
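A small sketch of wrapping a stateful constructor and then using the live value via `flatMap` (the `Counter` class is a made-up example):

```scala
// Constructing this class allocates mutable state, so it's a side effect
class Counter {
  private var n = 0
  def increment(): Int = { n += 1; n }
}

// Wrap `new` in IO, then flatMap to use the live value
val useCounter: IO[Int] =
  IO(new Counter).flatMap(counter => IO(counter.increment()))
```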
IO Examples
```scala
import cats.effect.std.Random

val ioExample = for {
  _    <- IO.println("Hello cats!") // Uses the `Console[IO]` capability
  time <- Clock[IO].realTimeInstant
  _    <- IO.println(s"It is now $time")
  rnd  <- Random.scalaUtilRandom[IO]
  _    <- rnd.nextInt.flatMap(n => IO.println(s"Your lucky number is: $n"))
} yield ()
```

```scala
yolo(ioExample)
// Hello cats!
// It is now 2021-08-02T15:18:32.737Z
// Your lucky number is: -514751454
```
`IO` - general tips

`import cats.syntax.all._` for generic combinators (eg `tupled`, `handleError`)
`import cats.effect.implicits._` for effect methods (eg `parSequence`, `parTupled`, `parMapN`)
To run `IO`, you need either an `IORuntime` or create a `Dispatcher[IO]`.
`IORuntime.global` is fine for production! It's not like `ExecutionContext.global`
`Dispatcher[F]` is cheap and works with "capability trait" style (aka "tagless final").
The `unsafe*` methods on `IO` take the `IORuntime` implicitly.
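A sketch of `Dispatcher` wrapping a callback-style imperative API (the `ImperativeApi` trait and `wire` helper are hypothetical):

```scala
import cats.effect.std.Dispatcher

// An imperative library that expects a plain callback
trait ImperativeApi { def onMessage(callback: String => Unit): Unit }

// Register a callback that runs our IO via the dispatcher's controlled unsafe region
def wire(api: ImperativeApi, handle: String => IO[Unit]): Resource[IO, Unit] =
  Dispatcher[IO].map { dispatcher =>
    api.onMessage(msg => dispatcher.unsafeRunAndForget(handle(msg)))
  }
```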
`IO` - Common errors

Creating an `IO` without flatMap-ing it
Nesting one `IO` inside another, eg in `map`
Expecting `Future`-style concurrency
Turn on `-Ywarn-value-discard`! (`sbt-tpolecat` enables it)

```scala
def doStuff: IO[Unit] = ???

def oops = {
  // This IO never executes!
  IO.println("About to do stuff")
  doStuff
}

def correct =
  IO.println("About to do stuff")
    .flatMap(_ => doStuff)
```

Tip: `fa.flatMap(_ => fb)` is so common it has an operator: `fa >> fb`
`IO` - Application structure

The repository contains larger `IO` and `Resource` usage and simpler data sharing examples.

Code and slides at `daenyth/intro-cats-effect` on GitHub.