
A Whirlwind Tour of Scalameta

For any Scalameta related questions, don't hesitate to ask on our Gitter channel.

Note. This tutorial was originally created for a workshop at the scala.world conference. The workshop material turned out to be useful for many, so it has been moved here. You will still find occasional references to the workshop.


There are several ways to install and run Scalameta.


You can use Scalameta as a library; the scalameta umbrella package includes modules for trees, tokens, parsing, pretty printing, the semantic API and more:
libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0"
Optionally, for extra utilities, you can add the contrib module:
libraryDependencies += "org.scalameta" %% "contrib" % "1.8.0"


The examples mentioned in this tutorial are available in a single repository that you can clone and run locally.
  1. Clone the workshop repo. Alternatively, for a minimal project template that uses the bleeding edge version of scalameta/paradise, clone this repo.
  2. Run sbt test to make sure everything works.
  3. Open the file Playground.scala
    package scalaworld
    import scala.meta._
    import org.scalameta.logger // useful for debugging
    class Playground extends org.scalatest.FunSuite {
      test("part 1: tokens") {
        val tokens = "val x = 2".tokenize.get
      }
      test("part 2: trees") {
        val tree = "val x = 2".parse[Stat].get
      }
    }
  4. To test playground on every edit, run sbt ~library/test.
  5. Setup the project in your favorite IDE, for example IntelliJ, ENSIME or vim.

Ammonite REPL

To experiment with Scalameta in the REPL, you can run the following in the Ammonite REPL
import $ivy.`org.scalameta:scalameta_2.11:1.8.0`, scala.meta._
Note. The macro annotation examples will not run in the REPL, follow scalameta/paradise#10 for updates.


To accompany the workshop, here is the recording from the original conference talk.


Make sure you have Scalameta installed as a library, as described in Setup. You can run these examples from the console or from sbt, for example in the tutorial repo.

This whole workshop will assume you have this import in scope:

scala> import scala.meta._
import scala.meta._

Here's how to tokenize a small expression.

scala> "val x = 2".tokenize.get
res0: scala.meta.tokens.Tokens = Tokens(, val,  , x,  , =,  , 2, )
Let's discuss the most interesting methods on tokens.


The simplest method we can call is Tokens.syntax. It returns a string representation of the actual code behind the tokens, i.e. how the code looks to a developer.

scala> "val x = 2".tokenize.get.syntax
res0: String = val x = 2

Tokens.toString() uses .syntax behind the scenes. However, you should never rely on toString() when manipulating Scalameta structures; prefer to explicitly call .syntax. It may not be obvious why right now, but it will make more sense soon.
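A minimal sketch making the intent explicit (assuming the scalameta 1.8.0 dependency from Setup is on the classpath):

```scala
import scala.meta._

val tokens = "val x = 2".tokenize.get
// Be explicit about wanting the code representation rather than
// relying on toString():
val shown: String = tokens.syntax
println(shown) // val x = 2
```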


Another useful method is Tokens.structure. The method shows details that may be relevant to us as metaprogrammers.

scala> "val x = 2".tokenize.get.structure
res0: String = Tokens(BOF [0..0), val [0..3),   [3..4), x [4..5),   [5..6), = [6..7),   [7..8), 2 [8..9), EOF [9..9))

.structure is often useful for debugging and testing.

Tokens vs. Token

The class Tokens is a wrapper around a sequence of Token objects. There are multiple subtypes of Token, while there is only one type Tokens.

scala> "val x = 2".tokenize.get.head
res0: scala.meta.tokens.Token =

BOF stands for "beginning of file". Let's see what other kinds of tokens are in the string:
scala> "val x = 2".tokenize.get.
  map(x => f"${x.structure}%10s -> ${x.productPrefix}").
  mkString("\n")
res0: String =
BOF [0..0) -> BOF
val [0..3) -> KwVal
    [3..4) -> Space
  x [4..5) -> Ident
    [5..6) -> Space
  = [6..7) -> Equals
    [7..8) -> Space
  2 [8..9) -> Int
EOF [9..9) -> EOF

Even spaces get their own tokens. The [0..3) part indicates that the val token starts at offset 0 and ends at offset 3.


What does token equality look like?

scala> "foobar".tokenize.get(1) == "foobar kas".tokenize.get(1)
res0: Boolean = false
Huh, why are they not the same?

Token equality is implemented with reference equality. You need to be explicit if you actually mean syntactic (.syntax), or structural (.structure) equality.

The tokens are syntactically equal.
scala> "foobar".tokenize.get(1).syntax == "foobar kas".tokenize.get(1).syntax
res0: Boolean = true
Even if we move the tokens around
scala> "kas foobar".tokenize.get(3).syntax == "foobar kas".tokenize.get(1).syntax
res0: Boolean = true
The tokens are also structurally equal.
scala> "foobar".tokenize.get(1).structure == "foobar kas".tokenize.get(1).structure
res0: Boolean = true
However, they are not structurally equal if we move them around.
scala> "kas foobar".tokenize.get(3).structure == "foobar kas".tokenize.get(1).structure
res0: Boolean = false


Tokenization can sometimes fail, for example in this case:

scala> """ val str = "unclosed literal """.tokenize
res0: scala.meta.tokenizers.Tokenized =
<input>:1: error: unclosed string literal
val str = "unclosed literal

If you prefer, you can safely pattern match on the tokenize result

scala> """ val str = "closed literal" """.tokenize match {
  case tokenizers.Tokenized.Success(tokenized) => tokenized
  case tokenizers.Tokenized.Error(e, _, _) => ???
}
res0: scala.meta.tokens.Tokens = Tokens(,  , val,  , str,  , =,  , "closed literal",  , )


Scalameta tokens are the foundation of Scalameta. Sometimes you don't have access to a parsed AST, and then your best shot is to work with tokens.

In the following chapter we will discuss another exciting data structure: the incredible scala.meta.Tree.


Reminder. We assume you have this import in scope:

scala> import scala.meta._
import scala.meta._


The easiest way to get started with Scalameta trees is using quasiquotes.

scala> q"case class User(name: String, age: Int)"
res0: meta.Defn.Class = case class User(name: String, age: Int)

Quasiquotes can be composed

scala> val method = q"def `is a baby` = age < 1"
method: meta.Defn.Def = def `is a baby` = age < 1

scala> q"""
case class User(name: String, age: Int) {
  $method
}
"""
res0: meta.Defn.Class = case class User(name: String, age: Int) { def `is a baby` = age < 1 }

Quasiquotes can also be used to deconstruct trees with pattern matching

scala> q"def `is a baby` = age < 1" match {
  case q"def $name = $body" =>
    s"You ${name.syntax} if your ${body.syntax}"
}
res0: String = You `is a baby` if your age < 1

NOTE. Quasiquotes currently ignore comments:

scala> q"val x = 2 // assignment".syntax
res0: String = val x = 2
If you need comments, you can use .parse[T]
scala> "val x = 2 // assignment".parse[Stat].get.syntax
res0: String = val x = 2 // assignment


If the contents that you want to parse are only known at runtime, you can't use quasiquotes. For example, this happens when you need to parse file contents.

Here's how to parse a compilation unit.

scala> "object Main extends App { println(1) }".parse[Source].get
res0: scala.meta.Source = object Main extends App { println(1) }

Pro tip. You can also call .parse[T] on a File, like this:

scala> new"readme/ParseMe.scala").parse[Source]
res0: scala.meta.parsers.Parsed[scala.meta.Source] =
class ParseMe { println("I'm inside a file") }

If we try to parse a statement as a compilation unit we will fail.

scala> "val x = 2".parse[Source]
res0: scala.meta.parsers.Parsed[scala.meta.Source] =
<input>:1: error: expected class or object definition
val x = 2

We need to explicitly parse it as a statement (Stat).

scala> "val x = 2".parse[Stat].get
res0: scala.meta.Stat = val x = 2

We can also parse a case statement.

scala> "case Foo(bar) if bar > 2 => println(bar)".parse[Case].get
res0: scala.meta.Case = case Foo(bar) if bar > 2 => println(bar)

Scalameta has dozens of parsers, one per syntactic category.

However, .parse[Stat] and .parse[Source] are usually all you need.
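As an illustration, here is a sketch using one of those other parsers, parse[Type] (assuming the scalameta 1.8.0 setup from earlier):

```scala
import scala.meta._

// Parse a type rather than a statement or a compilation unit.
val tpe = "Map[Int, List[String]]".parse[Type].get
println(tpe.syntax) // Map[Int, List[String]]
```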


I didn't tell the whole story when I said you need to pass in a type argument to parse statements. You also need to pass in a dialect! However, Scalameta will by default pick the Scala211 dialect for you if you don't provide one explicitly.

With the Sbt0137 dialect, we can parse vals as top-level statements.

scala> dialects.Sbt0137(
  "lazy val core = project.settings(commonSettings)"
).parse[Source].get
res0: scala.meta.Source = lazy val core = project.settings(commonSettings)

We can even parse multiple top level statements

scala> dialects.Sbt0137(
  """
  lazy val core = project.settings(commonSettings)

  lazy val extra = project.dependsOn(core)
  """
).parse[Source].get
res0: scala.meta.Source =

  lazy val core = project.settings(commonSettings)

  lazy val extra = project.dependsOn(core)

For the remainder of the workshop, we will only work with the Scala211 dialect.


Just like with tokens, we can also run .syntax on trees.

scala> "foo(bar)".parse[Stat].get.syntax
res0: String = foo(bar)
However, Scalameta can also do this even if you manually construct the tree:
scala> Term.Apply(
  Term.Name("foo"),
  Seq(Term.Name("bar"): Term.Arg)
).syntax
res0: String = foo(bar)

We never gave Scalameta parentheses, but it still figured out we needed them. Pretty cool, huh?


Just like with tokens, we can also run .structure on trees.

scala> "foo(bar)".parse[Stat].get.structure
res0: String = Term.Apply(Term.Name("foo"), Seq(Term.Name("bar")))

.structure ignores any syntactic trivia like whitespace and comments

scala> "foo  ( /* this is a comment */ bar  ) // eol".parse[Stat].get.structure
res0: String = Term.Apply(Term.Name("foo"), Seq(Term.Name("bar")))

This can be useful for example in debugging, testing or equality checking.


You can call .collect on a scala.meta.Tree just like on regular collections.

scala> source"""sealed trait Op[A]
    object Op extends B {
      case class Foo(i: Int) extends Op[Int]
      case class Bar(s: String) extends Op[String]
    }""".collect { case cls: Defn.Class => }
res0: List[meta.Type.Name] = List(Foo, Bar)


Transform a scala.meta.Tree with .transform.

scala> q"myList.filter(_ > 3 + a).headOption // comments are removed :(".transform {
  case q"$lst.filter($cond).headOption" => q"$lst.find($cond)"
}
res0: scala.meta.Tree = myList.find(_ > 3 + a)

.transform does not preserve syntactic details such as comments and formatting. Some work has been done on source-aware transformation (see #457), but it still needs more polish.


Just like with tokens, tree equality is reference equality by default:

scala> q"foo(bar)" == q"foo(bar)"
res0: Boolean = false
This means you need to be explicit if you mean syntactic equality
scala> q"foo(bar)".syntax == q"foo(bar)".syntax
res0: Boolean = true

or structural equality

scala> q"foo(bar)".structure == q"foo(bar)".structure
res0: Boolean = true

Comprehensive trees

A key feature of Scalameta trees is that they comprehensively cover all corners of the Scala syntax. A side effect of this is that the Scalameta tree hierarchy contains a lot of types. For example, there is a different tree node for an abstract def (Decl.Def)

scala> q"def add(a: Int, b: Int)" // Decl.Def
res0: meta.Decl.Def = def add(a: Int, b: Int): Unit
and a def with an implementation (Defn.Def)
scala> q"def add(a: Int, b: Int) = a + b" // Defn.Def
res0: meta.Defn.Def = def add(a: Int, b: Int) = a + b

Fortunately, most of the time you won't need to worry about this. Quasiquotes help you create/match/compose/deconstruct the correct instances. However, occasionally you may need to debug the types of the trees you have.
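For example, a small sketch of how you might debug which node you have (the helper describe is hypothetical, not part of Scalameta):

```scala
import scala.meta._

// Distinguish abstract declarations from concrete definitions.
def describe(stat: Stat): String = stat match {
  case d: Decl.Def => s"abstract def ${d.name.value}"
  case d: Defn.Def => s"concrete def ${d.name.value}"
  case other       => other.productPrefix // fall back to the node type's name
}

println(describe(q"def add(a: Int, b: Int)"))         // abstract def add
println(describe(q"def add(a: Int, b: Int) = a + b")) // concrete def add
```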

For your convenience, I've compiled together the most common types in this handy diagram:

Macro annotations

Note. Scalameta-based macro annotations will soon be superseded, and they have several known limitations.

In addition, efforts to support def macros and Dotty have been moved to scalamacros/scalamacros. New scalameta/paradise bug reports or feature requests will not be addressed. New scalameta/paradise pull requests to fix known issues will continue to be reviewed and we are happy to cut new releases with contributed fixes.

Setup build

It's possible to write macro annotations on Scalameta trees using the Scalameta paradise compiler plugin. To configure the plugin, you need to enable it in your build, both for the projects that define macro annotations and for the projects that use them:
lazy val macroAnnotationSettings = Seq(
  addCompilerPlugin("org.scalameta" % "paradise" % "3.0.0-M10" cross CrossVersion.full),
  scalacOptions += "-Xplugin-require:macroparadise",
  scalacOptions in (Compile, console) ~= (_ filterNot (_ contains "paradise")) // macroparadise plugin doesn't work in repl yet.
)

// Requires scalaVersion 2.11.11 or 2.12.3
lazy val projectThatDefinesMacroAnnotations = project.settings(
  macroAnnotationSettings,
  libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0" % Provided
  // ... your other project settings
)

lazy val projectThatUsesMacroAnnotations = project.settings(
  macroAnnotationSettings
  // ... your other project settings
)
These settings are already configured in the tutorial repo.


Hello World

Here is an example macro annotation:
package scalaworld.macros

import scala.meta._
import scala.collection.immutable.Seq

class Main extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn match {
      case q"object $name { ..$stats }" =>
        MainMacroImpl.expand(name, stats)
      case _ =>
        abort("@main must annotate an object.")
    }
  }
}

// This is an example of how we can refactor the macro implementation into a
// utility function which can be used for unit testing, see MainUnitTest.
object MainMacroImpl {
  def expand(name: Term.Name, stats: Seq[Stat]): Defn.Object = {
    val main = q"def main(args: Array[String]): Unit = { ..$stats }"
    q"object $name { $main }"
  }
}
The annotation wraps the body of an object in a main method, serving a similar purpose to extending App.
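Because the expansion logic lives in MainMacroImpl, you can sketch what the annotation does without running the compiler plugin (the object name Hello is just an illustration):

```scala
import scala.meta._
import scala.collection.immutable.Seq

// Expand a hypothetical `@Main object Hello { println("hi") }` by hand.
val expanded = MainMacroImpl.expand(q"Hello", Seq(q"""println("hi")"""))
// The result is an object whose statements are wrapped in a main method.
assert(expanded.syntax.contains("def main(args: Array[String]): Unit"))
```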


Implement a Class2Map macro annotation that injects a toMap method that creates a Map[String, Any] from the fields of this class.


package scalaworld.macros

import scala.collection.immutable.Seq
import scala.meta._

// Before:
// @Class2Map
// class Class2MapExample(a: Int, b: String)(c: List[Int])
// After:
// class Class2MapExample(a: Int, b: String)(c: List[Int]) {
//   def toMap: _root_.scala.collection.Map[String, Any] =
//     _root_.scala.collection.Map(("a", a), ("b", b), ("c", c))
// }

class Class2Map extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn match {
      case cls @ Defn.Class(_, _, _, Ctor.Primary(_, _, paramss), template) =>
        val namesToValues: Seq[Term.Tuple] = paramss.flatten.map { param =>
          q"(${}, ${Term.Name(})"
        }
        val toMapImpl: Term =
          q"_root_.scala.collection.Map[String, Any](..$namesToValues)"
        val toMap =
          q"def toMap: _root_.scala.collection.Map[String, Any] = $toMapImpl"
        val templateStats: Seq[Stat] = toMap +: template.stats.getOrElse(Nil)
        cls.copy(templ = template.copy(stats = Some(templateStats)))
      case _ =>
        abort("@Class2Map must annotate a class.")
    }
  }
}


Implement a WithApply macro annotation that creates an apply method to construct an instance of the class (just like the one generated for case classes).

The challenge here is to handle the companion object correctly.


package scalaworld.macros

import scala.collection.immutable.Seq
import scala.meta._

// Before:
// @WithApply
// class WithApplyExample(a: Int)(b: String)
// After:
// class WithApplyExample(a: Int)(b: String)
// object WithApplyExample {
//   def apply(a: Int)(b: String): WithApplyExample = new WithApplyExample(a)(b)
// }

class WithApply extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    def createApply(name: Type.Name, paramss: Seq[Seq[Term.Param]]): Defn.Def = {
      val args = => Term.Name(
      q"""def apply(...$paramss): $name =
            new ${Ctor.Ref.Name(name.value)}(...$args)"""
    }
    defn match {
      // companion object exists
      case Term.Block(
          Seq(cls @ Defn.Class(_, name, _, ctor, _),
              companion: Defn.Object)) =>
        val applyMethod = createApply(name, ctor.paramss)
        val templateStats: Seq[Stat] =
          applyMethod +: companion.templ.stats.getOrElse(Nil)
        val newCompanion = companion.copy(
          templ = companion.templ.copy(stats = Some(templateStats)))
        Term.Block(Seq(cls, newCompanion))
      // companion object does not exist
      case cls @ Defn.Class(_, name, _, ctor, _) =>
        val applyMethod = createApply(name, ctor.paramss)
        val companion   = q"object ${Term.Name(name.value)} { $applyMethod }"
        Term.Block(Seq(cls, companion))
      case _ =>
        abort("@WithApply must annotate a class.")
    }
  }
}


Open up Debug.scala and implement a Debug macro annotation for methods.


package scalaworld.macros

import scala.annotation.compileTimeOnly
import scala.meta._

// Before:
// @Debug
// def complicated(a: Int, b: String)(c: Int): Int = {
//   Thread.sleep(500)
//   a + b.length + c
// }
// After:
// def complicated(a: Int, b: String)(c: Int): Int = {
//   {
//     println("a" + ": " + a)
//     println("b" + ": " + b)
//     println("c" + ": " + c)
//   }
//   val start = System.currentTimeMillis()
//   val result = {
//     Thread.sleep(500)
//     a + b.length + c
//   }
//   val elapsed = System.currentTimeMillis() - start
//   println("Method " + "complicated" + " ran in " + elapsed + "ms")
//   result
// }
class Debug extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn match {
      case defn: Defn.Def =>
        val printlnStatements = defn.paramss.flatten.map { param =>
          q"""println(${} + ": " + ${Term.Name(})"""
        }
        val body: Term = q"""
          { ..$printlnStatements }
          val start = System.currentTimeMillis()
          val result = ${defn.body}
          val elapsed = System.currentTimeMillis() - start
          println("Method " + ${} + " ran in " + elapsed + "ms")
          result
        """
        defn.copy(body = body)
      case _ =>
        abort("@Debug must annotate a def.")
    }
  }
}

For extra credit:


Implement a generic macro annotation to automatically derive a shapeless Generic[T] instance.

Note. Macro annotations are purely syntactic, see How do I get the type of a tree?. As a result, to find the subclasses of a sealed trait, we depend on the assumption that all the subclasses are put inside the companion object of the sealed trait. The implementation below looks inside the companion object and extracts the definitions of classes which extend the sealed trait.


package scalaworld.macros

import scala.collection.immutable.Seq
import scala.meta._

// Before:
// @generic
// case class Foo(i: Int, s: String)
// @generic
// sealed trait Bar
// object Bar {
//   case class Baz(i: Int)     extends Bar
//   case class Quux(s: String) extends Bar
// }
// After:
// // infix operators are used where possible, avoiding the syntax ::[A, B]
// case class Foo(i: Int, s: String)
// object Foo {
//   implicit val FooGeneric: _root_.shapeless.Generic[Foo] =
//     new _root_.shapeless.Generic[Foo] {
//       import shapeless._
//       type Repr = Int :: String :: HNil
//       def from(r: Repr): Foo = r match {
//         case i :: s :: HNil => new Foo(i, s)
//       }
//       def to(t: Foo): Repr = t.i :: t.s :: HNil
//     }
// }
// sealed trait Bar
// object Bar {
//   implicit val BarGeneric: _root_.shapeless.Generic[Bar] =
//     new _root_.shapeless.Generic[Bar] {
//       import shapeless._
//       type Repr = Baz :+: Quux :+: CNil
//       def from(r: Repr): Bar = r match {
//         case Inl(t)      => t
//         case Inr(Inl(t)) => t
//         case Inr(Inr(cnil)) => cnil.impossible
//       }
//       def to(t: Bar): Repr = t match {
//         case t: Baz  => Inl(t)
//         case t: Quux => Inr(Inl(t))
//       }
//     }
//   case class Baz(i: Int) extends Bar()
//   case class Quux(s: String) extends Bar()
// }

// This implementation contains quite a bit of boilerplate because we
// generate similar code in term, type and pattern position.
class generic extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    defn match {
      // Sealed ADT, create coproduct Generic.
      case Term.Block(
          Seq(t @ ClassOrTrait(mods, name), companion: Defn.Object))
          if GenericMacro.isSealed(mods) =>
        val oldTemplStats = companion.templ.stats.getOrElse(Nil)
        val subTypes = oldTemplStats.collect {
          case t: Defn.Class if GenericMacro.inherits(name)(t) => t
        }
        val newStats =
          GenericMacro.mkCoproductGeneric(name, subTypes) +: oldTemplStats
        val newCompanion =
          companion.copy(templ = companion.templ.copy(stats = Some(newStats)))
        Term.Block(Seq(t, newCompanion))
      // Plain class with companion object, create HList Generic.
      case Term.Block(
          Seq(cls @ Defn.Class(_, name, _, ctor, _),
              companion: Defn.Object)) =>
        val newStats =
          GenericMacro.mkHListGeneric(name, ctor.paramss) +:
            companion.templ.stats.getOrElse(Nil)
        val newCompanion =
          companion.copy(templ = companion.templ.copy(stats = Some(newStats)))
        Term.Block(Seq(cls, newCompanion))
      // Plain class without companion object, create HList Generic.
      case cls @ Defn.Class(_, name, _, ctor, _) =>
        val companion =
          q"""object ${Term.Name(name.value)} {
                ${GenericMacro.mkHListGeneric(name, ctor.paramss)}
              }"""
        Term.Block(Seq(cls, companion))
      case defn: Tree =>
        abort("@generic must annotate a class or a sealed trait/class.")
    }
  }
}

object GenericMacro {
  def mkCoproductTerm(depth: Int): Term =
    if (depth <= 0) q"Inl(t)"
    else q"Inr(${mkCoproductTerm(depth - 1)})"

  def mkCoproductPattern(depth: Int): Pat =
    if (depth <= 0) p"Inl(t)"
    else p"Inr(${mkCoproductPattern(depth - 1)})"

  // final unreachable case in `from` for coproduct generic.
  def mkCantHappen(depth: Int): Pat =
    if (depth <= 0) p"Inr(cnil)"
    else p"Inr(${mkCantHappen(depth - 1)})"

  def mkGeneric(name: Type.Name,
                repr: Type,
                to: Term,
                from: Seq[Case],
                importStat: Stat): Stat = {
    val reprTyp: Stat = q"type Repr = $repr"
    val toDef: Stat   = q"def to(t: $name): Repr = $to"
    val fromDef: Stat =
      q"def from(r: Repr): $name = r match { ..case $from }"
    val implicitName = Pat.Var.Term(Term.Name(name.syntax + "Generic"))

    q"""implicit val $implicitName: _root_.shapeless.Generic[$name] =
          new _root_.shapeless.Generic[$name] {
            $importStat
            $reprTyp
            $toDef
            $fromDef
          }"""
  }
  def mkCoproductGeneric(superName: Type.Name,
                         subTypes: Seq[Defn.Class]): Stat = {
    val coproductType: Type = subTypes.foldRight[Type](t"CNil") {
      case (cls, accum) =>
        t"${} :+: $accum"
    }
    val coproductTermCases: Seq[Case] = {
      case (cls, i) =>
        p"case t: ${} => ${mkCoproductTerm(i)}"
    }
    val coproductTerm = q"t match { ..case $coproductTermCases }"
    val coproductPat: Seq[Case] = {
      case (cls, i) =>
        p"case ${mkCoproductPattern(i)} => t"
    }
    val cantHappen =
      p"""case ${mkCantHappen(subTypes.length - 1)} =>
            cnil.impossible"""
    mkGeneric(superName,
              coproductType,
              coproductTerm,
              coproductPat :+ cantHappen,
              q"import shapeless.{CNil, :+:, Inr, Inl}")
  }

  def mkHListGeneric(name: Type.Name, paramss: Seq[Seq[Term.Param]]): Stat = {
    val params = paramss match {
      case params :: Nil => params
      case _             => abort("Can't create generic for curried functions yet.")
    }
    val hlistType: Type = params.foldRight[Type](t"HNil") {
      case (Term.Param(_, _, Some(decltpe: Type), _), accum) =>
        t"$decltpe :: $accum"
      case (param, _) =>
        abort(s"Unsupported parameter ${param.syntax}")
    }
    val hlistTerm: Term = params.foldRight[Term](q"HNil") {
      case (param, accum) =>
        q"t.${Term.Name(} :: $accum"
    }
    val hlistPat: Pat = params.foldRight[Pat](q"HNil") {
      case (param, accum) =>
        p"${Pat.Var.Term(Term.Name(} :: $accum"
    }
    val args = => Term.Name(
    val patmat =
      p"case $hlistPat => new ${Ctor.Ref.Name(name.value)}(..$args)"
    mkGeneric(name,
              hlistType,
              hlistTerm,
              Seq(patmat),
              q"import shapeless.{::, HNil}")
  }

  def isSealed(mods: Seq[Mod]): Boolean = mods.exists(_.syntax == "sealed")

  // Poor man's semantic API, we check that X in `class Foo extends X`
  // matches syntactically the name of the annotated sealed type.
  def inherits(superType: Type.Name)(cls: Defn.Class): Boolean =
    cls.templ.parents.headOption.exists {
      case q"$parent()" => parent.syntax == superType.syntax
      case _            => false
    }
}

object ClassOrTrait {
  def unapply(any: Defn): Option[(Seq[Mod], Type.Name)] = any match {
    case t: Defn.Class => Some((t.mods,
    case t: Defn.Trait => Some((t.mods,
    case _             => None
  }
}

Testing macro annotations

See MainTest for an example of how to both unit test and integration test a macro annotation.
package scalaworld.macros

import java.io.{ByteArrayOutputStream, PrintStream}

import scala.meta._
import scala.meta.testkit._
import org.scalatest.FunSuite

// If you are doing complicated macro expansions, it's recommended to unit test
// the trickiest bits instead of relying only on integration tests.
class MainUnitTest extends FunSuite {

  // TODO(olafur) this method should be exposed in testkit
  def assertStructurallyEqual(obtained: Tree, expected: Tree): Unit = {
    StructurallyEqual(obtained, expected) match {
      case Left(AnyDiff(x, y)) =>
        fail(s"""Not structurally equal!:
                |obtained: $x
                |expected: $y
                |""".stripMargin)
      case _ =>
    }
  }

  test("@Main creates a main method") {
    val obtained = MainMacroImpl.expand(q"AnswerToEverything",
                                        List(q"val x = 42", q"println(x)"))
    val expected =
      q"""
        object AnswerToEverything {
          def main(args: Array[String]): Unit = {
            val x = 42
            println(x)
          }
        }
      """
    assertStructurallyEqual(obtained, expected)
  }
}

// This is an integration test because it requires running the macro expansion
// through the entire compiler pipeline; if you have a bug in your macro
// annotation, the expanded code may not compile, causing your test suite not to compile.
class MainIntegrationTest extends FunSuite {
  test("@Main creates a main method") {
    val out: ByteArrayOutputStream = new ByteArrayOutputStream()
    Console.withOut(new PrintStream(out)) {
      // run the @Main-annotated object here
    }
    assert(out.toString.stripLineEnd == "Hello Scalameta macros!")
  }
}

Semantic API

The Scalameta Semantic API offers operations to query information from the Scala compiler such as naming resolution (println => _root_.scala.Predef.println), inferred type annotations, reported compiler messages and more. These operations can for example be used by tools like scalafix to refactor Scala code. The Semantic API cannot be used from Macro annotations.


The Semantic API is based on the concept of mirrors. In Scalameta parlance, a mirror is an entity that encapsulates a compilation context, providing a classpath and a sourcepath to perform semantic operations. Most Semantic APIs take an implicit parameter of type Mirror.

Unlike the syntactic API, which is implemented completely in-house, Scalameta delegates implementations of Mirror to external projects. In Scala, performing even the simplest semantic operations requires a full-blown typechecker, so implementing even a simple Mirror in-house would require us to reinvent a Scala typechecker, which is a multi-man-year effort. Currently, we provide a Mirror implementation that is backed by the Scala 2.x compiler, see Scalahost. There are plans to implement mirrors for Dotty and IntelliJ.

Semantic DB

One of the properties of semantic databases is their portability. Unlike typical representations of semantic information in Scala, semantic databases are not tied to a particular implementation of the Scala typechecker. This makes it possible for metaprograms written against the Scalameta Semantic API to run on multiple platforms.

Another important property is persistence. Since semantic databases are portable, they can be created and consumed in separate environments. This is a key insight that we promote in Scalameta, and we are confident that it will revolutionize the developer tool ecosystem in Scala.

Currently, a typical approach to semantic developer tools in Scala is implementing them as compiler plugins and then using them inside builds. Apart from being a hassle to configure, this approach is also quite slow, because it needs to run a Scala typechecker every time a tool is invoked.

Scalameta enables a much more convenient workflow. First, we use our scalahost compiler plugin to typecheck a given codebase and generate a semantic database for it. This is done only once per unique snapshot of the codebase. Afterwards, using the persisted semantic database, we can launch any number of developer tools any number of times without having to typecheck the codebase again.

The storage format used for the Semantic DB is defined using Protocol Buffers, or "protobuf" for short. The Semantic DB protobuf schema is small, at the time of this writing it is ~50 lines of code. The full schema is available here. Files containing Semantic DB binary data use the .semanticdb file extension by convention.


To build a Semantic DB with the Scala 2.x compiler, you will need the "Scalahost" compiler plugin. The Scalahost compiler plugin is developed in the main Scalameta repository. There are several ways to integrate Scalahost into your build.


The required setup for the Semantic API is more involved than for the Syntactic API (parsing/tokenizing). Most importantly, we first need to compile source files with the Scala compiler and Scalahost compiler plugin in order to collect information such as types, symbols and more. While it's possible to tokenize or parse code that doesn't compile, a source file must successfully typecheck in order for it to be analyzed with the Semantic API.


For a plug and play example repository, clone sbt-semantic-example and run sbt app/run


lazy val scalahostSettings = Seq(
  addCompilerPlugin("org.scalameta" % "scalahost" % "1.8.0" cross CrossVersion.full),
  scalacOptions ++= Seq(
    "-Yrangepos", // required by Scalahost
    "-Xplugin-require:scalahost"
  )
)
Add these settings to your project like this:
lazy val analyzeMe = project.settings(
  scalahostSettings,
  // Scalahost supports the Scala versions: 2.11.11, 2.12.3.
  // Note, the Scala version must match down to the latest (PATCH) number.
  scalaVersion := "2.11.11"
)

Other build tools

You can use Scalahost outside of sbt by passing in custom scalac flags.
// -Xplugin-require:scalahost tells scalac to fail fast if Scalahost failed to load
// -Yrangepos is required by Scalahost
scalac -Xplugin:/path/to/scalahost.jar -Yrangepos -Xplugin-require:scalahost Code.scala

scalameta/tutorial semantic/run

If you have cloned the repo for this tutorial as explained in Tutorial, you can execute sbt semantic/run to run an example application using the semantic API.


If you have coursier installed, you can run this one-liner assuming you want to analyze a source file Foo.scala
// make sure the version of scalac is 2.11.11
scalac -Yrangepos -Xplugin-require:scalahost -Xplugin:$(coursier fetch --intransitive org.scalameta:scalahost_2.11.11:1.8.0) Foo.scala

Verify installation

To verify that the Scalahost compiler plugin is installed, check that .semanticdb files appear in your build output after compiling. If you cannot find any .semanticdb files, then something went wrong with the setup. Don't hesitate to ask questions on our Gitter channel.
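One way to check, assuming a standard sbt layout (the exact output directory may differ in your build), is to search the compiled output for the generated files:

```shell
# list any Semantic DB files produced by the scalahost plugin
find . -name "*.semanticdb"
```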


Let's take a look at the inside of a .semanticdb file. Given a source file Input.scala

package scalaworld.semantic

case class Input(x: Int) {
  def +(other: Input) = Input(x + other.x)
  def +(other: Int) = Input(x + other)
  1 + 2
  List(x).map(num => Input(num + 1) + Input(2) + 3)
}

the pretty printed representation of its corresponding Input.semanticdb is the following:


[8..18): scalaworld => _root_.scalaworld.
[19..27): semantic => _root_.scalaworld.semantic.
[40..45): Input => _root_.scalaworld.semantic.Input#
[46..47): x => _root_.scalaworld.semantic.Input#(x)
[49..52): Int => _root_.scala.Int#
[62..63): + => _root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.
[64..69): other => _root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.(other)
[71..76): Input => _root_.scalaworld.semantic.Input#
[80..85): Input => _root_.scalaworld.semantic.Input.
[86..87): x => _root_.scalaworld.semantic.Input#(x)
[88..89): + => _root_.scala.Int#`+`(I)I.
[90..95): other => _root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.(other)
[96..97): x => _root_.scalaworld.semantic.Input#(x)
[105..106): + => _root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.
[107..112): other => _root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.(other)
[114..117): Int => _root_.scala.Int#
[121..126): Input => _root_.scalaworld.semantic.Input.
[127..128): x => _root_.scalaworld.semantic.Input#(x)
[129..130): + => _root_.scala.Int#`+`(I)I.
[131..136): other => _root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.(other)
[142..143): + => _root_.scala.Int#`+`(I)I.
[148..152): List => _root_.scala.collection.immutable.List.
[153..154): x => _root_.scalaworld.semantic.Input#(x)
[156..159): map => _root_.scala.collection.immutable.List#map(Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;.
[160..163): num => semantic/input/src/main/scala/scalaworld/semantic/Input.scala@160..163
[167..172): Input => _root_.scalaworld.semantic.Input.
[173..176): num => semantic/input/src/main/scala/scalaworld/semantic/Input.scala@160..163
[177..178): + => _root_.scala.Int#`+`(I)I.
[182..183): + => _root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.
[184..189): Input => _root_.scalaworld.semantic.Input.
[193..194): + => _root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.

Messages:
[140..145): [warning] a pure expression does nothing in statement position; you may be omitting necessary parentheses

Denotations:
_root_.scalaworld.semantic.Input# => case class Input
_root_.scalaworld.semantic.Input#(x) => val param x: Int
_root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;. => def +: (other: Int)scalaworld.semantic.Input
_root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.(other) => param other: Int
_root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;. => def +: (other: scalaworld.semantic.Input)scalaworld.semantic.Input
_root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.(other) => param other: scalaworld.semantic.Input
_root_.scalaworld.semantic.Input#`<init>`(I)V. => primaryctor <init>: (x: Int)scalaworld.semantic.Input
semantic/input/src/main/scala/scalaworld/semantic/Input.scala@160..163 => param num: Int

Sugars:
[152..152) [Int]
[159..159) [scalaworld.semantic.Input, List[scalaworld.semantic.Input]]
[197..197) (scala.collection.immutable.List.canBuildFrom[scalaworld.semantic.Input])

That is a lot of output. Let's take a closer look at each of the sections: Names, Messages, Denotations, and Sugars.



The Names section is a map from names/identifiers to the Symbol they reference. A name is an identifier such as List or +. A name can have different symbols depending on where the name appears. For example, + references three different symbols in the example above

// Int + Int
[88..89): +   => _root_.scala.Int#`+`(I)I.
// Input + Input
[62..63): +   => _root_.scalaworld.semantic.Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.
// Input + Int
[105..106): + => _root_.scalaworld.semantic.Input#`+`(I)Lscalaworld/semantic/Input;.


The Denotations section is a map from Symbols that appear in the source file to the Denotation of the symbol.


The Messages section is a list of the compiler messages reported during compilation (info/warning/error). Note that some reported messages, such as deprecation warnings, are not yet collected; see #759.


EXPERIMENTAL/NEW. The Sugars section is a list of "sugars" that are inserted by the compiler. Sugars don't have positions in the original source file. Sugars include inferred type parameters, type annotations of definitions and implicit arguments. Note that sugars are currently not enriched with semantic information such as symbols or denotations; see #822.


Every definition such as a class/trait/val/var/def/object has a unique identifier. This unique identifier is called a Symbol. The string representation of a symbol is the erased JVM signature of the definition. This makes it possible to distinguish between overloaded methods. To understand what a symbol means, inspect its Denotation.
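To see why erased signatures matter, consider a plain-Scala sketch, self-contained and borrowing names from the example above. The two `+` overloads below erase to different JVM signatures, which is what lets Scalameta assign each a distinct symbol:

```scala
// Self-contained sketch: overloads distinguished by their erased JVM signature.
//   Input#`+`(Lscalaworld/semantic/Input;)Lscalaworld/semantic/Input;.  (Input => Input)
//   Input#`+`(I)Lscalaworld/semantic/Input;.                            (Int   => Input)
case class Input(x: Int) {
  def +(other: Input): Input = Input(x + other.x) // symbol ends in (L...Input;)L...Input;.
  def +(other: Int): Input   = Input(x + other)   // symbol ends in (I)L...Input;.
}

object SymbolDemo extends App {
  println(Input(1) + Input(2)) // picks the (Input)Input overload; prints Input(3)
  println(Input(1) + 2)        // picks the (Int)Input overload; prints Input(3)
}
```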


A denotation represents a Symbol's meaning, including modifiers, annotations, visibility restrictions and type signature. A denotation can, for example, tell you whether a definition is a trait or a class, or whether it has the implicit modifier.

Call for contributors!

The Scalameta semantic API is very young, so we need your help to mature it as fast as possible.

The main area for contributions is ensuring that we can generate semantic databases for even the trickiest snippets of Scala code. We would greatly appreciate it if you could add the `sbt-scalahost` plugin to your sbt build, generate a semantic database of your project and report back on our gitter channel.


Scalameta contrib is a module that provides common utilities for handling Scalameta data structures.

To use contrib, import scala.meta.contrib._.

Contrib exposes some collection-like methods on Tree.

scala> source"""
class A
trait B
object C
object D
""".find(_.is[Defn.Object])
res0: Option[scala.meta.Tree] = Some(object C)

scala> source"""
class A
trait B
object C {
  val x = 2
  val y = 3
}
object D
""".collectFirst { case q"val y = $body" =>  body.structure }
res1: Option[String] = Some(Lit.Int(3))

scala> source"""
class A
trait B
object C {
  val x = 2
  val y = 3
}
object D
""".contains(q"val z = 4")
res2: Boolean = false

Contrib has an Equal typeclass for comparing trees by structural or syntactic equality.

scala> q"val x = 2".isEqual(q"val x = 1")
res0: Boolean = false

scala> (q"val x = 2": Stat).isEqual("val x = 2 // comment".parse[Stat].get)
res1: Boolean = true

scala> (q"val x = 2": Stat).isEqual[Syntactically]("val x = 2 // comment".parse[Stat].get)
res2: Boolean = false

scala> q"lazy val x = 2".mods.exists(_.isEqual(mod"lazy"))
res3: Boolean = true

scala> q"lazy val x = 2".contains(q"3")
res4: Boolean = false

scala> q"lazy val x = 2".contains(q"2")
res5: Boolean = true

Contrib has an AssociatedComments helper to extract the leading and trailing comments of tree nodes.

scala> val code: Source = """
/** This is a docstring */
trait MyTrait // leading comment
""".parse[Source].get
code: scala.meta.Source =

/** This is a docstring */
trait MyTrait // leading comment

scala> val comments = AssociatedComments(code)
comments: scala.meta.contrib.AssociatedComments = scala.meta.contrib.AssociatedComments$$anon$1@406ace1a

scala> val myTrait = code.find(_.is[Defn.Trait]).get
myTrait: scala.meta.Tree = trait MyTrait

scala> comments.leading(myTrait) -> comments.trailing(myTrait)
res0: (Set[meta.tokens.Token.Comment], Set[meta.tokens.Token.Comment]) = (Set(/** This is a docstring */),Set(// leading comment))


How do I get the type of a tree?

It is not possible to query for type information from a macro annotation. Macro annotations are purely syntactic, and there is no plan to add support for capabilities such as getting the fields/members of a type/class. Why? Because you run into a chicken-and-egg problem: the generated definitions are necessary to type check the source file, and the type checker is necessary to provide type information to the macro annotation.

Where can I ask more questions?

On our gitter channel.

How do I test a macro annotation?

You can either unit test or integration test your macro annotation. See Testing macro annotations.

How do I identify a particular annotation?

Maybe this helps:

What is the quasiquote for X?

Here is an overview of quasiquote syntax:

';' expected but 'def' found. inline def apply

Be sure that the org.scalameta:paradise compiler plugin is enabled.

Can I use Scalameta with Scala.js?

Yes, the main Scalameta modules support Scala.js.

How do I pass an argument to the macro annotation?

You can pattern match on `this` as a Scalameta tree. For example:
package scalaworld.macros

import scala.meta._

class Argument(arg: Int) extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    // `this` is a scala.meta tree.
    val arg = this match {
      // The argument needs to be a literal like `1` or a string like `"foobar"`.
      // You can't pass in a variable name.
      case q"new $_(${Lit.Int(arg)})" => arg
      // Example if you have more than one argument.
      case q"new $_(${Lit.Int(arg)}, ${Lit.String(foo)})" => arg
      case _ => ??? // default value
    }
    println(s"Arg is $arg")
    defn.asInstanceOf[Stat] // return the annotated definition unchanged
  }
}

Do I need to depend on Scalameta at runtime?

No, not at runtime, but your project does need a compile-time dependency on Scalameta. If you only use Scalameta at compile time, you can mark the dependency as % "provided" to exclude it from your runtime application.
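In sbt, that looks like the following sketch (the version shown is the one used elsewhere in this tutorial):

```scala
// Scalameta is needed only at compile time, so it is excluded
// from the runtime classpath via the "provided" configuration.
libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0" % "provided"
```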

How do I use macro annotations provided by a third-party library?

If your project depends on a library that provides macro annotations, you need to enable the `paradise` compiler plugin and declare a dependency on `scalameta` so that the macro annotations can be expanded:
addCompilerPlugin(
  ("org.scalameta" % "paradise" % paradiseVersion).cross(CrossVersion.full)
)

libraryDependencies +=
  "org.scalameta" %% "scalameta" % scalametaVersion % Provided

Here is a complete `settings` definition, necessary and sufficient to let a dependent project use the library (including workarounds for features that are currently being worked on):
lazy val enableMacroAnnotations: Seq[Def.Setting[_]] = Seq(
  addCompilerPlugin("org.scalameta" % "paradise" % paradiseVersion cross CrossVersion.full),
  libraryDependencies += "org.scalameta" %% "scalameta" % scalametaVersion % Provided,
  scalacOptions += "-Xplugin-require:macroparadise",
  // The macroparadise plugin doesn't work in the REPL yet.
  scalacOptions in (Compile, console) ~= (_ filterNot (_ contains "paradise"))
)
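These settings can then be mixed into any module that uses the annotations; a sketch (the project name `usesAnnotations` is hypothetical):

```scala
// sbt sketch: reuse the settings sequence in a dependent module.
// `usesAnnotations` is a hypothetical project name.
lazy val usesAnnotations = project
  .settings(enableMacroAnnotations: _*)
```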

How do I reuse code between macros?

If you try to call a method defined inside your macro class, you get an "X not found" error.
class Argument(arg: Int) extends scala.annotation.StaticAnnotation {
  def helper(t: Any): Stat = ??? // utility method
  inline def apply(defn: Any): Any = meta {
    helper(defn) // ERROR: `helper` not found
    // Why? Inside `meta`, `this` is a Scalameta tree, not an `Argument` instance.
  }
}
You can move the utility method to an external object.
package scalaworld.macros

import scala.meta._

object MacroUtil {
  def helper(defn: Any): Stat = q"class ReuseExample"
}

class Reuse extends scala.annotation.StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    MacroUtil.helper(defn)
  }
}
Incremental compiler is messing up / stale code

While editing the macro, it can be handy to keep this command running in sbt:
~; macros/test:clean ; macros/test:run
Incremental compilation caches the macro expansion, so you need to clean the test project on every run.

My IDE/IntelliJ shows red squiggly marks

Your IDE might be all red with spurious errors.

There are two possible workarounds:
  1. (Recommended if using IntelliJ) First, install the 2016.3 EAP. Then, select nightly or EAP update channel from Updates tab in Scala plugin settings Settings -> Languages and frameworks -> Scala -> Updates.
  2. (hack) Add import autocomplete._ to your file and a semicolon after inline, like this

    Remember to remove the semicolon when you run your macro.

New-style def macros

Scalameta does not yet support writing new-style def macros, but we are working hard on implementing this functionality. Attend Eugene Burmako's talk at Scala eXchange 2016 to learn more about our progress.

Compatibility with traditional macros

At the moment, new-style macros can only take apart existing Scala syntax and generate new syntax (so called syntactic API). This corresponds to the functionality provided by traditional macro annotations that only use tree constructors and quasiquotes.

Even this limited functionality should be enough to port most of the existing macro annotations to Scalameta. Oleksandr Olgashko has ported a large subset of Simulacrum's @typeclass features to new-style macros, so we are confident that new-style macros are powerful enough to support even more complex annotations.

For new-style def macros, we are working on a semantic API, which will provide compiler information such as type inference, name resolution and other functionality that requires typechecking. It is too early to tell how compatible this API will be with what is provided by scala.reflect. We will provide more information as the design of the semantic API shapes up.

Which versions of Scala do the Scalameta macros support?

2.11.x and 2.12.x.

I'd like to use the semantic API as a replacement for scalac's presentation compiler. Is this doable? Is it the intended usage?

It's not doable and it's not the intended usage. The prime application we are focused on at the moment is refactoring with scalafix.

How do I use the Semantic API?

See the Semantic API section of this tutorial.


Does Scalameta integrate with Zinc in order to achieve the semantic api?

No, Zinc and Scalameta are unrelated. The Scalameta Semantic API is enabled with the Scalahost scalac compiler plugin. Scalahost is designed to accommodate incremental compilation in order to play nicely with Zinc.

What's the status of new-style macros with Dotty?


Will new-style macros remove the limitation of two-tiered compilation?

No. The major focus of the new macro system is portability (the same implementation running on scalac + dotty + intellij). See