
Custom scalars do not parse the underlying model  #991

@filosganga

Description


Hello, I am playing with the custom scalars, and I have found something weird. Assuming I have a scalar like this one:

    val LocalDateType = ScalarType[LocalDate](
      "LocalDate",
      coerceOutput = { (d, caps) =>
        if (caps.contains(DateSupport))
          new Date(d.atStartOfDay.toInstant(ZoneOffset.UTC).toEpochMilli)
        else
          d.format(
            DateTimeFormatter.ISO_LOCAL_DATE
          )
      },
      coerceUserInput = {
        case s: String =>
          Either
            .catchNonFatal(
              LocalDate.parse(s, DateTimeFormatter.ISO_LOCAL_DATE)
            )
            .leftMap(_ => LocalDateCoercionViolation)
        case n: Long =>
          LocalDate.ofEpochDay(n).asRight[LocalDateCoercionViolation.type]
        case _ => Left(LocalDateCoercionViolation)
      },
      coerceInput = {
        case ast.StringValue(s, _, _, _, _) =>
          Either
            .catchNonFatal(
              LocalDate.parse(s, DateTimeFormatter.ISO_LOCAL_DATE)
            )
            .leftMap(_ => LocalDateCoercionViolation)
        case ast.IntValue(n, _, _) =>
          LocalDate
            .ofEpochDay(n.toLong)
            .asRight[LocalDateCoercionViolation.type]
        case _ => Left(LocalDateCoercionViolation)
      }
    )
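
For reference, the snippet above assumes imports and a coercion violation roughly like the following; the actual definitions are not shown in this issue, so this is only a sketch:

    import java.time.{LocalDate, ZoneOffset}
    import java.time.format.DateTimeFormatter
    import java.util.Date

    import cats.syntax.either._

    import sangria.ast
    import sangria.marshalling.DateSupport
    import sangria.schema.ScalarType
    import sangria.validation.ValueCoercionViolation

    // Hypothetical definition of the violation referenced by the scalar above
    case object LocalDateCoercionViolation
        extends ValueCoercionViolation("LocalDate value expected")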

And a schema like so:

val schemaSdl = graphql"""
    scalar LocalDate

    type Customer {
      id: ID!
      givenName: String!
      familyName: String!
      birthday: LocalDate!,
      memberOf: [Membership!]!
    }

    type Account  {
      id: ID!
      members: [Membership!]!
    }

    type Membership {
      account: Account!
      customer: Customer!
    }

    type Query {
      customer(id: ID!): Customer
    }

    schema {
      query: Query
    }
  """

val schema = Schema.buildFromAst[Context](
      schemaSdl,
      AstSchemaBuilder.resolverBased(
        InstanceCheck.field[Context, Json],
        ScalarResolver[Context] {
          case ScalarTypeDefinition("LocalDate", _, _, _, _) => LocalDateType
        },
        // ...
        FieldResolver[Context] {
          case (AstSchemaBuilder.TypeName("Query"), field)
              if field.name == "customer" =>
            _ =>
              Future(
                Json.obj(
                  "id" -> Json.fromString("CUS-123"),
                  "givenName" -> Json.fromString("Filippo"),
                  "familyName" -> Json.fromString("De Luca"),
                  "birthday" -> Json.fromString("02/08/1908"),
                )
              )
        },
        // ...
        FieldResolver.defaultInput[Context, Json]
      )
    )

When the FutureResolver tries to resolve the birthday value, it fails with a ClassCastException, because the scalar's coerceOutput expects a LocalDate while, under the hood, our model is Json.
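
For context, the exception shows up as soon as a query touches the birthday field, for example with something like the following; the query text and executor wiring are my own sketch, and the circe marshalling import plus the Context construction are assumptions:

    import scala.concurrent.ExecutionContext.Implicits.global

    import sangria.execution.Executor
    import sangria.macros._
    import sangria.marshalling.circe._

    val query = graphql"""
      query {
        customer(id: "CUS-123") {
          givenName
          birthday
        }
      }
    """

    // coerceOutput receives the raw Json value taken from the model instead of a
    // LocalDate, so the cast inside the scalar fails with a ClassCastException
    Executor.execute(schema, query, userContext = new Context) // however the user context is built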

Then I noticed the extractScalar function in ResolverBasedAstSchemaBuilder:

private def extractScalar[In](t: ScalarType[_], value: In)(implicit iu: InputUnmarshaller[In]) = {
    val coerced = iu.getScalarValue(value)

    t match {
      case BooleanType =>
        coerced match {
          case v: Boolean => v
          case v: String => safe(v.toBoolean, "Boolean", value)
          case _ => invalidType("Boolean", value)
        }
      case StringType =>
        coerced.toString
      case IDType =>
        coerced match {
          case s: String => s
          case _ => invalidType("ID", value)
        }
      // ...
      case _ => coerced
    }
  }

In my opinion it should use the scalar's coerceInput or coerceUserInput before returning coerced (which is not really coerced at that point), because that is what it effectively does for the other built-in types: it returns Long, BigInt, BigDecimal and so on, while for a custom scalar it returns the value taken from the InputUnmarshaller as-is.

I have tried changing the last case like this:

case _ => t.coerceUserInput(coerced).fold(v => throw new Exception(v.errorMessage), identity)

It does the trick; however, I am not sure whether it should really use coerceInput instead.

Making the above change, all the tests are green 🟢
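
For reference, the difference between the two is that coerceInput works on parsed GraphQL AST nodes, while coerceUserInput works on already unmarshalled runtime values; since iu.getScalarValue returns plain runtime values rather than AST nodes, coerceUserInput looks like the better fit here. A quick sketch of the distinction (the example values are mine):

    // coerceInput: the value comes from the parsed query document as an AST node
    LocalDateType.coerceInput(ast.StringValue("1908-08-02"))
    // => Right(LocalDate.parse("1908-08-02"))

    // coerceUserInput: the value is an already unmarshalled runtime value, e.g. from
    // variables or, in this case, whatever the InputUnmarshaller pulled out of the Json model
    LocalDateType.coerceUserInput("1908-08-02")
    // => Right(LocalDate.parse("1908-08-02"))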
