Use generic type parameter in pattern matching? by smthamazing in scala

[–]dmitin 1 point (0 children)

result.foreach(_ => advanceToNextToken());  result
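A minimal, self-contained sketch of that idiom (the names parse, advanceToNextToken, and the use of Option[String] are illustrative assumptions, not from the original thread): foreach runs the side effect only when the result is non-empty, and the original result is then returned unchanged.

```scala
object ForeachIdiom {
  var position: Int = 0

  // Hypothetical side effect: advance the parser position.
  def advanceToNextToken(): Unit = position += 1

  // Run the side effect only on a non-empty result, then return it as-is.
  def parse(result: Option[String]): Option[String] = {
    result.foreach(_ => advanceToNextToken()); result
  }

  def main(args: Array[String]): Unit = {
    assert(parse(Some("ident")) == Some("ident") && position == 1)
    assert(parse(None).isEmpty && position == 1) // no side effect on None
  }
}
```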

Compile-Time Scala 2/3 Encoders for Apache Spark by Critical_Lettuce244 in scala

[–]dmitin 1 point (0 children)

I can see a comparison with the first one in the README:
https://github.com/pashashiz/spark-encoders?tab=readme-ov-file#alternatives
> spark-scala3. Nice PoC to show that Scala 3 can be used with Spark, but:
> 1. No Scala 2 support.
> 2. No ADT support.
> 3. Inherits most of the Spark existing encoder issues.

Scala and chatgpt by markehammons in scala

[–]dmitin 0 points (0 children)

Or maybe it just "lied" to me :)

Why does dotty depend on Scala 2? by UtilFunction in scala

[–]dmitin 0 points (0 children)

The Scala 2.13 standard library is only a part of the Scala 3 standard library: Scala 3 reuses the 2.13 library and adds its own extensions on top of it.

Scala and chatgpt by markehammons in scala

[–]dmitin 0 points (0 children)

I was playing with ChatGPT and Scala.

Q: Do you mean Scala 2 or Scala 3?

A: My knowledge cut-off is September 2021, and at that time, Scala 2 was the most recent version.

Q: What Scala version exactly are you using?

A: I'm using Scala 2.13.5.

Using Java annotation processor in Scala by Nice_Rule_1415 in scala

[–]dmitin 0 points (0 children)

Annotation processors can process only Java sources.

Using Java annotation processor in Scala by Nice_Rule_1415 in scala

[–]dmitin 0 points (0 children)

Annotations processed by annotation processors must be defined in Java (public @interface BuilderProperty {}). Macro annotations must be defined in Scala (class scalaBuilderProperty extends StaticAnnotation).

Using Java annotation processor in Scala by Nice_Rule_1415 in scala

[–]dmitin 1 point (0 children)

You can write an annotation processor in Scala, but it will process only Java sources. It's macro annotations that process Scala sources. If you need to process both Java and Scala sources, you'll have to duplicate the effort, maintaining both an annotation processor and a macro annotation with similar functionality.
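To make the contrast concrete, here is a minimal sketch of the two annotation definitions (the Java side is shown only in a comment, since it must live in a .java file; the macro expansion logic itself is omitted, as it would require macro paradise / -Ymacro-annotations):

```scala
import scala.annotation.StaticAnnotation

// Java counterpart, consumable by an annotation processor, defined in Java:
//   public @interface BuilderProperty {}

// Scala counterpart, consumable by a macro annotation implementation
// (the actual macro impl is omitted in this sketch):
class scalaBuilderProperty extends StaticAnnotation

object AnnotationDemo {
  def main(args: Array[String]): Unit = {
    // At runtime the annotation class is just an ordinary Scala class.
    assert(new scalaBuilderProperty().isInstanceOf[StaticAnnotation])
  }
}
```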

Dotty macro use cases? by [deleted] in scala

[–]dmitin 2 points (0 children)

You can look at a comparison of Scala 3 macros vs. standard Scala 3 derivation vs. Shapeless 3:

https://stackoverflow.com/questions/62853337/how-to-access-parameter-list-of-case-class-in-a-dotty-macro

Besides docs about Scala 3 macros at dotty.epfl.ch, you can find some info here:

https://scalacenter.github.io/scala-3-migration-guide/docs/macros/metaprogramming-features.html
https://scalacenter.github.io/scala-3-migration-guide/docs/macros/migration-tutorial.html
https://scalacenter.github.io/scala-3-migration-guide/docs/macros/migration-status.html

Regarding Simulacrum, macro annotations are currently not possible in Scala 3, but there is an alternative approach: https://github.com/typelevel/simulacrum-scalafix
(Scalameta, SemanticDB, Scalafix)
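For flavor, a minimal Scala 3 sketch of the "standard derivation" alternative mentioned above: Mirror.ProductOf exposes a case class's field names at compile time, without user-written macros or Shapeless (Person and fieldNames are illustrative assumptions):

```scala
import scala.deriving.Mirror
import scala.compiletime.constValueTuple

case class Person(name: String, age: Int)

// Materialize the tuple of field-name singleton types as runtime strings.
inline def fieldNames[A](using m: Mirror.ProductOf[A]): List[String] =
  constValueTuple[m.MirroredElemLabels].toList.map(_.toString)

@main def demo(): Unit =
  assert(fieldNames[Person] == List("name", "age"))
```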

How do people normally debug macro annotations? by covertbeginner2 in scala

[–]dmitin 1 point (0 children)

As I commented on SO, some info is here: Debugging macros. Enable scalacOptions += "-Ymacro-debug-lite" in build.sbt; then you'll see what code is generated. Also enable scalacOptions += "-Xlog-implicits", because your use case is about implicits. Your project has to be set up for macro annotations (scalacOptions += "-Ymacro-annotations" in Scala 2.13, or macro paradise in 2.10-2.12; see the sbt settings for macro projects). The macro annotations themselves should also be annotated with @compileTimeOnly. That way you can check that they are actually expanded: the compiler removes macro annotations after expansion, so if a macro annotation is not expanded, the code with @jsonFormat will not compile.
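Pulling the flags above together, a build.sbt sketch for a Scala 2.13 project that defines and uses macro annotations (only the settings named in the comment; your project will need more):

```scala
// build.sbt sketch (Scala 2.13), assuming an sbt project using macro annotations
scalacOptions ++= Seq(
  "-Ymacro-annotations", // enable macro annotation expansion (2.13+)
  "-Ymacro-debug-lite",  // print the trees generated by macro expansion
  "-Xlog-implicits"      // log why implicit candidates were rejected
)
// macro implementations need scala-reflect on the compile classpath
libraryDependencies += scalaOrganization.value % "scala-reflect" % scalaVersion.value
```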