I wanted to look into rendering medical images in three.js, and it turned out to be surprisingly easy. Using react three fiber and cornerstone.js, the basic process is to parse a DICOM file with cornerstone.js, normalise the pixel intensity values, then create a DataTexture from the image data.
Setting the Scene
Set up a basic scene in react three fiber: we'll create a plane mesh to attach our texture to, plus a point light to illuminate the scene.
<Canvas camera={{ position: [0, 0, -30] }}>
  <color attach="background" args={["#fdfcf9"]} />
  <ambientLight />
  <pointLight position={[10, 10, 10]} />
  <mesh position={[0, 0, 0]}>
    <planeGeometry args={[5, 5, 1, 1]} />
    <meshBasicMaterial map={texture} side={THREE.DoubleSide} />
  </mesh>
  <OrbitControls />
</Canvas>
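The texture the material maps here gets built over the next two sections. One way to wire it up, purely as a sketch (the component name DicomPlane is just for illustration, not from the actual source), is to hold the texture in React state and only render the plane once it's ready:
import * as THREE from "three";
import { useEffect, useState } from "react";

function DicomPlane({ file }: { file: File }) {
  const [texture, setTexture] = useState<THREE.DataTexture | null>(null);

  useEffect(() => {
    // Parse the DICOM file and build the DataTexture here
    // (the code from the next two sections), then call setTexture with the result
  }, [file]);

  // Wait until the texture exists before rendering the plane
  if (!texture) return null;

  return (
    <mesh position={[0, 0, 0]}>
      <planeGeometry args={[5, 5, 1, 1]} />
      <meshBasicMaterial map={texture} side={THREE.DoubleSide} />
    </mesh>
  );
}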
Sourcing an Image
Most medical images come in DICOM format, a standard that encodes metadata like patient information and details about the capturing device alongside the image data. For my example I downloaded case 1 from here. Loading and parsing the DICOM file with cornerstone.js (sketched below) gives you the following imageFrame:
imageFrame: {
  largestPixelValue: 1687,
  smallestPixelValue: 0,
  photometricInterpretation: "MONOCHROME2",
  pixelData: [...], // Uint16Array(262144)
  columns: 512,
  rows: 512,
  ...
}
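For reference, the loading step looks roughly like this. This is only a sketch using cornerstone-core, dicom-parser and cornerstone-wado-image-loader; the exact wiring depends on the versions you're using, and how the decoded frame is exposed on the loaded image varies by loader version.
import cornerstone from "cornerstone-core";
import dicomParser from "dicom-parser";
import cornerstoneWADOImageLoader from "cornerstone-wado-image-loader";

// The WADO image loader needs its peer dependencies wired up once
cornerstoneWADOImageLoader.external.cornerstone = cornerstone;
cornerstoneWADOImageLoader.external.dicomParser = dicomParser;

// Register a local .dcm file with the loader and decode it
async function loadDicom(file: File) {
  const imageId = cornerstoneWADOImageLoader.wadouri.fileManager.add(file);
  const image = await cornerstone.loadImage(imageId);
  return image; // the decoded frame (pixel data, rows, columns, ...) hangs off this object
}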
Generating a Texture
To render the pixelData I use a DataTexture, which is a texture built from raw data. Since photometricInterpretation is "MONOCHROME2", I inferred that the pixelData array holds pixel intensities rather than, say, a flattened RGBA image. Using the largest and smallest pixel values I normalised the intensities to the range [0, 255], then mapped each intensity to the RGB channels.
const { columns, rows, largestPixelValue, pixelData } = imageFrame;

const channels = 4;
const size = columns * rows;
const buffer = new Uint8Array(size * channels);

pixelData.forEach((pixel: number, index: number) => {
  // Convert the raw intensity to the [0, 255] range
  const intensity = Math.floor((pixel / largestPixelValue) * 255);
  const stride = index * channels;
  // Fill in the RGB channels with the same intensity (greyscale)
  for (let channel = 0; channel < channels - 1; channel++) {
    buffer[stride + channel] = intensity;
  }
  // Fill in the alpha channel
  buffer[stride + 3] = 255;
});
const texture = new THREE.DataTexture(buffer, columns, rows);
// DataTexture doesn't upload to the GPU until it's flagged for update
texture.needsUpdate = true;
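One note on the normalisation: the loop above only divides by largestPixelValue, which works here because smallestPixelValue is 0. For a series whose smallest value isn't 0, a full min-max rescale would look something like this:
// Rescale [smallestPixelValue, largestPixelValue] to [0, 255]
const range = largestPixelValue - smallestPixelValue || 1;
const intensity = Math.floor(((pixel - smallestPixelValue) / range) * 255);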
Attach the texture to a material, add it to a plane mesh, and bada-bing bada-boom, there's a slice of a colon... I think.
Here's the source code