Proposal: Add validation without data unmarshalling #31

Open
cachesdev opened this issue Sep 19, 2024 · 4 comments

@cachesdev
Contributor

cachesdev commented Sep 19, 2024

Currently, zog handles both data unmarshalling and validation with Parse, which is great when you are using the standard library. However, this leads to duplicated unmarshalling in other HTTP frameworks; in my case, echo:

input := make(map[string]any)

// Bind already handles data unmarshalling to structs, but I need to use a map to comply with the zog api (and reduce code duplication).
err := c.Bind(&input)
if err != nil {
	return errutils.BindErrorResponse(c, logger, err)
}

var data struct {
	Email    string `zog:"email"`
	Password string `zog:"password"`
}

// unmarshalling happens again, and you end up with duplicated data.
errMap := z.Struct(z.Schema{
	"email":    z.String().Required().Email(),
	"password": z.String().Required(),
}).Parse(input, &data)
if errMap != nil {
	return c.JSON(http.StatusBadRequest, z.Errors.SanitizeMap(errMap))
}

A more memory- and performance-efficient version might look like this:

var input struct {
	Email    string `json:"email" zog:"email"`
	Password string `json:"password" zog:"password"`
}

err := c.Bind(&input)
if err != nil {
	return errutils.BindErrorResponse(c, logger, err)
}

errMap := z.Struct(z.Schema{
	"email":    z.String().Required().Email(),
	"password": z.String().Required(),
}).Validate(input) // <- notice the Validate method
if errMap != nil {
	return c.JSON(http.StatusBadRequest, z.Errors.SanitizeMap(errMap))
}

The utility of this is not limited to this case; it is also useful when you already have data from somewhere else and just want to validate it:

headerParts := strings.Split(authorizationHeader, " ")
if len(headerParts) != 2 || headerParts[0] != "Bearer" {
	return errutils.InvalidAuthenticationTokenResponse(c, mw.logger, nil)
}

token := headerParts[1]

// I've got nowhere to unmarshal this.
errList := z.String().Required().Min(26).Parse(token, ???)

My proposal is to either create a Validate method, or allow passing nil to Parse, or do both: allow nil in Parse and add a Validate method that wraps it for a better API.
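
For the token case, the proposed Validate method (hypothetical at this point, nothing in zog provides it yet) could look something like this:

token := headerParts[1]

// Hypothetical API: validate an existing value in place, no destination needed.
errList := z.String().Required().Min(26).Validate(&token)
if errList != nil {
	return errutils.InvalidAuthenticationTokenResponse(c, mw.logger, nil)
}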

@Oudwins
Owner

Oudwins commented Sep 19, 2024

Yes, I agree. This is already planned, probably for v0.10; I have a working prototype in a local branch.

The approach of validating a struct is fine but it does have some limitations:

  1. Preprocessing will not work (for things like transforming one type into another)
  2. You won't be able to model things like an age param that is nil if not set but can be 0 if set (to be fair, you can't do this with zog currently, but it is planned; see the sketch after this list)
  3. You will get worse error messages
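
To illustrate point 2 with plain Go (no zog involved): once JSON has been unmarshalled into a non-pointer field, a missing value and an explicit zero are indistinguishable, so a Validate call that only sees the struct cannot tell them apart. A pointer field is one way the distinction survives:

package main

import (
	"encoding/json"
	"fmt"
)

type withInt struct {
	Age int `json:"age"` // missing and "age": 0 both end up as 0
}

type withPtr struct {
	Age *int `json:"age"` // missing stays nil, "age": 0 becomes a non-nil pointer to 0
}

func main() {
	var a, b withInt
	_ = json.Unmarshal([]byte(`{}`), &a)
	_ = json.Unmarshal([]byte(`{"age": 0}`), &b)
	fmt.Println(a.Age == b.Age) // true: the difference is lost

	var c, d withPtr
	_ = json.Unmarshal([]byte(`{}`), &c)
	_ = json.Unmarshal([]byte(`{"age": 0}`), &d)
	fmt.Println(c.Age == nil, d.Age != nil) // true true: the difference survives
}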

Also, v0.9 will come with an update to the zhttp package, which will handle JSON as well. I think there is merit to recommending the zog implementation here, because we can introduce future performance improvements during parsing. You will be able to do schema.Parse(zhttp.Request(r), &dest) and it will parse into the destination. Under the hood this will still parse the body into a map first, but that can be optimized away at a later point.

Apart from that, I do agree 100% that this needs to be implemented. The prototype I have locally has the API schema.Validate(&data). There are a few considerations we have to make though, the main one being that this creates two conceptually similar functions that work VERY differently in practice.

For example, Parse right now checks Required based on the input data. That means the results here differ depending on whether you use Parse or Validate:

// Imagine this is the JSON we get from the request, and we have a schema for it:
s := `
{
	"name": "hello"
}
`
var o struct {
	Name int `json:"name"`
}
oSchema := z.Struct(z.Schema{"name": z.Int().Required()}).Required()

err := c.Bind(&o) // This will return an error that it cannot parse "hello" into name

oSchema.Validate(&o) // This will return an error that name is required because it is still 0

oSchema.Parse(zhttp.Request(c.Request()), &o) // This will return an error that it failed to coerce "hello" into an integer

Most of these weird behaviours are actually quite intuitive once you stop to think about them for a bit, but it is still the case that a single schema behaves in two different ways.

Some other concerns:

  • Currently, default & catch will just not work. These feel more like parsing concepts, but perhaps they should be implemented anyway; otherwise the results might look confusing from the user's perspective
  • PreTransforms don't make a lot of sense

@cachesdev
Contributor Author

Since Validate would depend on the result of a different unmarshaller, it makes sense that it would produce different errors when Bind fails. I can see your point; using zhttp makes more sense in this case (and if it's faster with better errors, then why not).

PreTransforms could make sense, but I can't think of many cases where you would want them. As for default and catch, it makes sense to drop them for Validate, since it won't be able to modify the object being validated either way.

@Oudwins
Owner

Oudwins commented Sep 22, 2024

Hey @cachesdev, I have been busy the last few days, but I have had some time to think about this and...

I don't want to exclude features from the schema when calling the Validate method, because I think it would be confusing for users. So I have decided to basically bite the bullet and implement Validate in such a way that all schema components will work (transforms, defaults, etc.). This just means it will take me a little bit longer to get it out, since it is more complex than I initially thought.

For now, I would recommend using the zhttp package when working with JSON. I have added better references to it in the docs, and the API should be very nice now. This will all go out in 0.9, which I plan on releasing tonight. It will also include the i18n code you contributed and a few other minor tweaks.

I think I mentioned this before, but this is what the ergonomics of using the zhttp package will look like now:

errs := userSchema.Parse(zhttp.Request(r), &user)
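
A minimal end-to-end sketch of those ergonomics with net/http (the import paths, schema, and handler shape are illustrative assumptions; only the Parse call and z.Errors.SanitizeMap appear earlier in this thread):

package main

import (
	"encoding/json"
	"net/http"

	z "github.com/Oudwins/zog"
	"github.com/Oudwins/zog/zhttp"
)

var userSchema = z.Struct(z.Schema{
	"email":    z.String().Required().Email(),
	"password": z.String().Required(),
})

func loginHandler(w http.ResponseWriter, r *http.Request) {
	var user struct {
		Email    string `zog:"email"`
		Password string `zog:"password"`
	}
	// zhttp.Request wraps the incoming request so the schema can pull the raw
	// values out of it and parse them straight into the destination struct.
	if errs := userSchema.Parse(zhttp.Request(r), &user); errs != nil {
		w.WriteHeader(http.StatusBadRequest)
		_ = json.NewEncoder(w).Encode(z.Errors.SanitizeMap(errs))
		return
	}
	// user.Email and user.Password are now populated and validated.
}

func main() {
	http.HandleFunc("/login", loginHandler)
	_ = http.ListenAndServe(":8080", nil)
}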

As for when Validate will be done, I don't know yet, but it and pointer support will be my next priorities.

@cachesdev
Contributor Author

cachesdev commented Sep 22, 2024

I am rewriting my endpoints to use zhttp and everything is working great. Great work as always. I do see the benefit of Validate having full support; pretransforms and such may not have a lot of use, but I bet there is an edge case or two where they will be appreciated. As I said before, I can't think of many reasons to use them, but there could still be some.
