gl_common: set glViewport() to maximum supported size
          #388
        
          
      
Similar to the changes in #349 and as discussed over there, this PR sets the viewport to the maximum supported dimensions in the new OpenGL backend. In contrast to #349, the projection matrix in each shader has to be set only once when setting up the shader program and not for each draw call.
Changes
- Query the maximum dimensions supported by `glViewport()` when initializing, so we don't have to worry about differently sized textures when rendering (usually the same as the maximum supported texture size, but dependent on the driver).
- Set up the projection matrix in each shader program once, using the queried maximum dimensions. This allows using screen coordinates for all vertex positions without having to keep track of framebuffer dimensions.
- Set `glViewport()` to the queried maximum dimensions for each draw call (`glDraw*()`, `glClear()`); see the sketch under Rationale below.

Rationale
Instead of keeping track of all framebuffer/texture sizes and updating the viewport and projection matrix for each shader and draw call, we set them up once to "cancel each other out" regardless of the actual buffer dimensions.
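As a concrete (purely illustrative) example using the standard GL viewport equation: with a queried maximum width `max_w` (a name chosen here just for illustration), the orthographic projection maps a vertex at screen coordinate `x` to `x_ndc = 2 * x / max_w - 1`, and the viewport transform maps that back to `x_window = (x_ndc + 1) * max_w / 2 = x` (analogously for `y`), so the window coordinates equal the screen coordinates we passed in, independent of the currently bound framebuffer's size.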
`glViewport()` itself is only responsible for the transformation from NDC to screen space, so setting it to larger-than-screen dimensions has no effect on video memory. Since the maximum dimensions are usually the same as the maximum supported texture sizes and not that much bigger than actual screen sizes (see the OpenGL Hardware Database), no loss in precision is to be expected. Initial testing showed that window coordinates are clipped to the screen dimensions regardless, so we never actually try to render outside the screen bounds.
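Roughly, the setup could look like the following minimal sketch (not the actual code from this PR; `gl_viewport_init`, `gl_shader_set_projection`, and the `proj_loc` parameter are made-up names, and libepoxy is assumed as the GL loader):

```c
#include <epoxy/gl.h>   // assumption: libepoxy provides the GL entry points

// Queried once at backend init; driver-dependent, usually >= max texture size.
static GLint max_viewport_dims[2];

// Hypothetical init helper: query the largest viewport the driver supports.
void gl_viewport_init(void) {
	glGetIntegerv(GL_MAX_VIEWPORT_DIMS, max_viewport_dims);
}

// Hypothetical per-program setup: upload an orthographic projection built from
// the same maximum dimensions, so vertex positions can stay in screen space.
void gl_shader_set_projection(GLuint prog, GLint proj_loc) {
	GLfloat w = (GLfloat)max_viewport_dims[0];
	GLfloat h = (GLfloat)max_viewport_dims[1];
	// Column-major ortho(0, w, 0, h, -1, 1): maps x in [0, w] to [-1, 1], etc.
	// (A real compositor would likely flip the y axis for a top-left origin.)
	const GLfloat projection[16] = {
		2.0f / w, 0.0f,     0.0f,  0.0f,
		0.0f,     2.0f / h, 0.0f,  0.0f,
		0.0f,     0.0f,     -1.0f, 0.0f,
		-1.0f,    -1.0f,    0.0f,  1.0f,
	};
	glUseProgram(prog);
	glUniformMatrix4fv(proj_loc, 1, GL_FALSE, projection);
	glUseProgram(0);
}

// Before each glDraw*()/glClear(), the viewport is simply reset to the maximum,
//     glViewport(0, 0, max_viewport_dims[0], max_viewport_dims[1]);
// regardless of which framebuffer or texture is currently bound.
```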
Related to #382

Having a unified viewport simplifies the rendering passes for the dual-filter Kawase blur algorithm.