Conversation

@chrisj (Contributor) commented Oct 20, 2025

We would like to add support for colormaps. Alpha was also mentioned, though that would require some rendering-logic changes, since it would conflict with visible segments; one option would be to force alpha to 0 on all non-visible segments. I will push a deployment with a rudimentary UI for ease of testing later today.

Example:

```json
"segmentPropertyColors": [
  {
    "type": "tag",
    "active": true,
    "map": {
      "L3": "#FFFFFF"
    }
  },
  {
    "type": "numeric",
    "active": true,
    "property": "NSI",
    "options": {
      "min": 5000,
      "max": 7500,
      "minColor": "#FF0000",
      "maxColor": "#0000FF"
    }
  }
],
```
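For illustration only (the interfaces and names below are hypothetical sketches, not the actual implementation in this PR), resolving a segment's color from such a rule list might look like:

```typescript
// Hypothetical sketch of resolving the proposed "segmentPropertyColors"
// rules for one segment. Names and shapes are illustrative only.

interface TagRule {
  type: "tag";
  active: boolean;
  map: Record<string, string>; // tag -> hex color
}

interface NumericRule {
  type: "numeric";
  active: boolean;
  property: string;
  options: { min: number; max: number; minColor: string; maxColor: string };
}

type ColorRule = TagRule | NumericRule;

function hexToRgb(hex: string): [number, number, number] {
  const v = parseInt(hex.slice(1), 16);
  return [(v >> 16) & 0xff, (v >> 8) & 0xff, v & 0xff];
}

function rgbToHex([r, g, b]: [number, number, number]): string {
  return "#" + [r, g, b].map((c) => Math.round(c).toString(16).padStart(2, "0")).join("");
}

// Later active rules override earlier ones; inactive rules are skipped.
function resolveColor(
  rules: ColorRule[],
  tags: Set<string>,
  properties: Record<string, number>,
): string | undefined {
  let color: string | undefined;
  for (const rule of rules) {
    if (!rule.active) continue;
    if (rule.type === "tag") {
      for (const [tag, c] of Object.entries(rule.map)) {
        if (tags.has(tag)) color = c;
      }
    } else {
      const value = properties[rule.property];
      if (value === undefined) continue;
      const { min, max, minColor, maxColor } = rule.options;
      // Clamp to [0, 1], then linearly interpolate between the endpoint colors.
      const t = Math.min(1, Math.max(0, (value - min) / (max - min)));
      const lo = hexToRgb(minColor);
      const hi = hexToRgb(maxColor);
      color = rgbToHex([
        lo[0] + t * (hi[0] - lo[0]),
        lo[1] + t * (hi[1] - lo[1]),
        lo[2] + t * (hi[2] - lo[2]),
      ]);
    }
  }
  return color;
}
```

Here later active rules win over earlier ones; whether the real implementation should layer rules this way is an open design question.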

http://localhost:8080/#!%7B%22dimensions%22:%7B%22x%22:%5B8e-9%2C%22m%22%5D%2C%22y%22:%5B8e-9%2C%22m%22%5D%2C%22z%22:%5B3.3e-8%2C%22m%22%5D%7D%2C%22position%22:%5B336923.5%2C198344.5%2C3314.5%5D%2C%22crossSectionScale%22:31.976459773130287%2C%22projectionOrientation%22:%5B0.2733137607574463%2C-0.049999721348285675%2C-0.28637629747390747%2C0.9169450402259827%5D%2C%22projectionScale%22:149658.33236829395%2C%22layers%22:%5B%7B%22type%22:%22segmentation%22%2C%22source%22:%5B%7B%22url%22:%22gs://h01-release/data/20210601/c3/%7Cneuroglancer-precomputed:%22%2C%22subsources%22:%7B%22default%22:true%2C%22bounds%22:true%2C%22properties%22:true%2C%22mesh%22:true%7D%2C%22enableDefaultSubsources%22:false%7D%2C%22gs://lichtman-h01-49eee972005c8846803ef58fbd36e049/goog14r0s5c3/segment_properties/%7Cneuroglancer-precomputed:%22%5D%2C%22tab%22:%22segments%22%2C%22segments%22:%5B%223823768399%22%2C%22%21101153362960%22%2C%22%213154035698%22%2C%223720480838%22%2C%223808204067%22%2C%223896803064%22%2C%224669328082%22%2C%225293286102%22%2C%225890292881%22%2C%226385207860%22%2C%226445451958%22%2C%226677125078%22%2C%2232443424006%22%2C%2233374997143%22%2C%224200138429%22%2C%2242108919642%22%5D%2C%22segmentQuery%22:%22#L4%20NVx%3E=7806957120.53791%20NVx%3C=20235583190.66803%20NSI%3E=5024%20NSI%3C=7675%22%2C%22segmentColors%22:%7B%223154035698%22:%22#ffffff%22%7D%2C%22segmentPropertyColors%22:%5B%7B%22type%22:%22tag%22%2C%22active%22:true%2C%22map%22:%7B%22L3%22:%22#FFFFFF%22%7D%7D%2C%7B%22type%22:%22numeric%22%2C%22active%22:true%2C%22property%22:%22NSI%22%2C%22options%22:%7B%22min%22:5000%2C%22max%22:7500%2C%22minColor%22:%22#FF0000%22%2C%22maxColor%22:%22#0000FF%22%7D%7D%5D%2C%22name%22:%22c3%22%7D%5D%2C%22showSlices%22:false%2C%22selectedLayer%22:%7B%22size%22:392%2C%22visible%22:true%2C%22layer%22:%22c3%22%7D%2C%22layout%22:%223d%22%2C%22selection%22:%7B%22size%22:693%2C%22visible%22:false%7D%7D

@jbms (Collaborator) commented Oct 20, 2025

This is related to #786 and #259.

It seems like the most general solution would be to support user-defined shaders for segmentation layers, and make the segment properties available to the shader.

Then you can also use shader UI controls to configure things.

Adding support for shaders to the rendering itself would be relatively straightforward.

The main implementation challenge is that a number of UI elements in Neuroglancer (such as the segment list) need access to the segment colors from JavaScript code. To compute these colors we would need to either:

  1. Execute the shader on the GPU in a special way, perhaps similar to how the fragment shader tests are implemented, in order to read back the colors. Doing this synchronously is slow so we probably need to do it asynchronously, with caching and batching to make it perform reasonably.
  2. Transpile the GLSL shader to JavaScript, e.g. using https://github.com/stackgl/glsl-transpiler/ and then execute from JavaScript. This would almost certainly be more performant and would basically work as a drop-in replacement to the existing logic in JavaScript for computing colors.
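As a rough sketch of option 2's shape (all names here are hypothetical, and glsl-transpiler's actual API is not shown), the transpiled function could sit behind the same interface the existing JavaScript color path uses, with memoization to amortize repeated lookups:

```typescript
// Hypothetical sketch: a GLSL-derived color function wrapped behind the
// existing JavaScript interface, with per-segment memoization.
// `transpiledColorFn` stands in for the output of a tool like
// glsl-transpiler; its real API is not shown here.

type ColorFn = (segmentId: bigint, properties: Record<string, number>) => number;

class SegmentColorCache {
  private cache = new Map<bigint, number>();
  constructor(private compute: ColorFn) {}

  get(segmentId: bigint, properties: Record<string, number>): number {
    let color = this.cache.get(segmentId);
    if (color === undefined) {
      color = this.compute(segmentId, properties);
      this.cache.set(segmentId, color);
    }
    return color;
  }

  // Must be called whenever the shader text or property values change.
  invalidate(): void {
    this.cache.clear();
  }
}

// Stand-in for a transpiled shader function: packs RGB into one integer.
const transpiledColorFn: ColorFn = (_id, props) =>
  props["NSI"] >= 5000 ? 0xff0000 : 0x0000ff;
```

The cache makes stale results possible, so anything that changes the shader's inputs has to invalidate it; this is the same caching concern option 1 would face, just without the GPU read-back latency.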

@jbms (Collaborator) commented Oct 20, 2025

Note: I think glsl-transpiler only supports GLSL 1, not GLSL 3. However, most of the differences relate to the inputs/outputs of vertex and fragment shaders, and texture access, which aren't relevant here. For this purpose we wouldn't transpile the entire real shader, but a modified shader that just defines the color computation as a function.

If there is something missing that we need it could probably be added to glsl-transpiler fairly easily.

@fcollman (Contributor) commented

Thanks for the feedback. We first had Chris exploring this kind of shader-code integration for a mesh shader (he has another branch where he has moved in this direction and it is basically working). The problem with this, as you pointed out, is that the 2D views and the default skeleton shaders were not following those colors.

I think you are suggesting that we have a "segment color" shader and then try to feed that back to JavaScript, but then we potentially have three shaders for segmentation layers (skeleton, mesh, segment color). We could also have a voxel shader for segmentation layers, but I don't know what I would do with such a thing. We were worried this would grow the Neuroglancer shader API, and it seemed like a design principle we were trying to follow was to limit the scope of the shader API; this is why we thought offering a finite set of controls might be more appealing. But conceptually I agree it is appealing to reuse all the UI infrastructure we have for shader controls to help users define mappings.

Looking forward, I do think it would be good to design something that allows mesh vertex data to drive mesh shading (the way vertex data can drive skeleton shading now). The framework of having the segment color available to the mesh vertex shader would play nicely with this concept, since the default shader could simply pass that color through to the vertex shader (which could use it or override it).

@jbms (Collaborator) commented Oct 21, 2025

Yes, I was imagining a segment color shader.

If we want to support volume rendering of segmentation layers in the future, I think we could utilize the same shader for volume rendering and rely on a boolean VOLUME_RENDERING value as in the image shader, and thereby avoid a separate shader.

The skeleton and mesh shaders have to be separate, I think, since they would each have access to different properties.

Supporting a mesh shader wouldn't be too difficult in principle, we just don't have support for vertex attributes at the moment.

@chrisj (Contributor, Author) commented Oct 21, 2025

So the segment color shader would get executed in JavaScript; should it then populate a color map like I did here, so it can be used by the sliceview renderer? Or does it get appended to the existing segmentation shaders, with the JavaScript execution being primarily for the segment list (and tab colors)?

In order to color directly in the sliceview shader, I believe I would need a texture per segment property.

@jbms (Collaborator) commented Oct 21, 2025

Yes, for sliceview rendering the shader would just get used directly, meaning we will need to figure out which segment properties it accesses and make those available to the shader e.g. as a hash table converted to a texture.

The JavaScript execution would just be for the cases where the color is currently computed in JavaScript. For meshes and skeletons we could compute it either in JavaScript, as we do currently, or in WebGL; it would probably be better to continue computing it in JavaScript.
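The hash-table-to-texture step could be sketched roughly as follows (hypothetical names; Neuroglancer's real GPU hash table implementation differs, and a float texture cannot represent all 64-bit segment IDs exactly, so a real version would use an integer texture format):

```typescript
// Hypothetical sketch of making a numeric segment property available to a
// sliceview shader: flatten (segmentId -> value) pairs into a Float32Array
// laid out as RGBA texels, ready for upload with gl.texImage2D. This linear
// layout is only illustrative; a shader-side lookup would need a real hash
// scheme, and float32 loses precision for IDs above 2^24.

function packPropertyForTexture(
  values: Map<bigint, number>,
  textureWidth: number,
): { data: Float32Array; height: number } {
  // One RGBA texel per entry: low 32 bits of the ID, high 32 bits,
  // the property value, and a "populated" flag.
  const entries = [...values.entries()];
  const height = Math.max(1, Math.ceil(entries.length / textureWidth));
  const data = new Float32Array(textureWidth * height * 4);
  entries.forEach(([id, value], i) => {
    data[i * 4 + 0] = Number(id & 0xffffffffn);
    data[i * 4 + 1] = Number(id >> 32n);
    data[i * 4 + 2] = value;
    data[i * 4 + 3] = 1; // marks the texel as populated
  });
  return { data, height };
}
```

As noted above, only the properties the shader actually accesses would need to be packed and uploaded this way.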
