This is the Notch node reference.

Notch node types:

Cameras:
Cameras define the point of view, perspective and aspect ratio from which the scene is rendered.

Cloning:
Cloners instantiate multiple copies of geometry nodes – such as 3D Objects, Text Nodes and Shape 3D nodes – that are parented to them.
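
As a rough sketch of the cloning concept (illustrative JavaScript, not the Notch node interface; all names are hypothetical): a cloner keeps a single piece of parented geometry and produces many transformed instances of it.

    // Generic linear cloner: one shared source mesh, N transformed instances.
    function cloneLinear(sourceMesh, count, step) {
      const instances = [];
      for (let i = 0; i < count; i++) {
        instances.push({
          mesh: sourceMesh,  // geometry is shared, not duplicated
          position: { x: i * step.x, y: i * step.y, z: i * step.z },
          rotation: { x: 0, y: 0, z: 0 },
          scale: { x: 1, y: 1, z: 1 },
        });
      }
      return instances;
    }

    // e.g. 10 clones of a cube, spaced 2 units apart along X
    const clones = cloneLinear({ name: "cube" }, 10, { x: 2, y: 0, z: 0 });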

Deformers:
Deformers modify the vertex positions of geometry nodes – such as 3D Objects, Text Nodes, Shape 3D nodes and particle systems – that the deformer is parented to. The deformation state is reset every frame.
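
Because the deformation is reset every frame, a deformer behaves like a pure function applied to the parent's rest-pose vertices each frame. A minimal sketch (illustrative JavaScript, not Notch code):

    // Hypothetical sine-wave deformer applied to rest-pose vertices each frame.
    // restVertices are never overwritten, so the deformation resets automatically.
    function applyWaveDeformer(restVertices, time, amplitude, frequency) {
      return restVertices.map(v => ({
        x: v.x,
        y: v.y + amplitude * Math.sin(frequency * v.x + time), // displace along Y
        z: v.z,
      }));
    }

    // Each frame, deform from the unmodified rest pose:
    const restVertices = [{ x: 0, y: 0, z: 0 }, { x: 1, y: 0, z: 0 }];
    const frameVertices = applyWaveDeformer(restVertices, 0.5, 0.2, 3.0);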

Fields:
Fields discretise 2D or 3D areas of space into texels / voxels and store colour and velocity data for them. They are often used for simulations of volumetric and fluid-like effects.
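
A rough sketch of what a 3D field stores (illustrative JavaScript, not Notch's internal data layout): a regular grid of voxels, each holding colour and velocity values that a simulation reads and writes.

    // Illustrative voxel field: a w*h*d grid with RGBA colour and XYZ velocity per voxel.
    function createField(w, h, d) {
      return {
        w, h, d,
        colour:   new Float32Array(w * h * d * 4), // RGBA per voxel
        velocity: new Float32Array(w * h * d * 3), // XYZ per voxel
      };
    }

    // Flatten a 3D voxel coordinate into an array index.
    function voxelIndex(field, x, y, z) {
      return x + field.w * (y + field.h * z);
    }

    const field = createField(64, 64, 64);
    const i = voxelIndex(field, 10, 20, 30);
    field.velocity[i * 3 + 1] = 1.0; // give this voxel an upward velocity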

Generators:
Generators create patterns mathematically in 2D or 3D, either directly to a parent canvas if connected to one (Fields, Render to Texture, Video nodes), or to their own canvas for use as image sources in their own right.
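
A tiny sketch of the idea (illustrative JavaScript, not Notch code): a 2D generator evaluates a mathematical pattern per texel and writes the result into a canvas / pixel buffer.

    // Illustrative 2D generator: a sine-based interference pattern evaluated per texel.
    function generatePattern(width, height, time) {
      const pixels = new Float32Array(width * height); // greyscale canvas
      for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
          const u = x / width, v = y / height;
          pixels[y * width + x] =
            0.5 + 0.5 * Math.sin(20 * u + time) * Math.cos(20 * v + time);
        }
      }
      return pixels;
    }

    const canvas = generatePattern(256, 256, 0.0);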

Geometry:
Geometry nodes are used to generate and/or render meshes, animated geometry and skeletal rigs.

Interactive:
Interactive nodes allow external inputs to be used in scenes. Sources include direct inputs such as keyboard and mouse, system inputs like the clock time, and Internet sources like Twitter and RSS feeds.

Lighting:
Lighting nodes control how the scene is lit.

Logic:
The Logic nodes are used to build a meta-logic system within Notch, e.g. triggering and switching behaviour based on conditions.

Materials:
Material nodes control surface appearance and response to the lighting environment of the scene.

Modifiers:
Modifier nodes can be attached to numeric value inputs in order to affect and change them, e.g. driving them with a mathematical function or with an external input source such as a MIDI controller.
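
A minimal sketch of how a modifier drives a numeric input (conceptual JavaScript, not the Notch API; the parameter names are hypothetical): the modifier evaluates a function of time, or reads an external source, and writes the result into the parameter it is attached to.

    // Conceptual sine modifier driving a numeric parameter each frame.
    function sineModifier(time, { frequency = 1, amplitude = 1, offset = 0 } = {}) {
      return offset + amplitude * Math.sin(2 * Math.PI * frequency * time);
    }

    // The parameter being modified, e.g. a light's brightness:
    const light = { brightness: 1.0 };

    // Each frame, the modifier drives the parameter value:
    function updateFrame(time) {
      light.brightness = sineModifier(time, { frequency: 0.5, amplitude: 0.5, offset: 1.0 });
    }
    updateFrame(0.25);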

Nodes:
This section contains miscellaneous nodes that do not fit into the other categories.

Particles:
Particle nodes perform simulation of particle-like effects. A Particle Root node is the root of the particle system; emitters emit particles into the system; affectors control the movement of particles; shading nodes are used to colour and affect them for rendering; and rendering nodes render them in various ways.
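
A conceptual sketch of how those roles fit together (illustrative JavaScript, not the Notch API; all names are hypothetical): the root owns the particle list, an emitter adds particles, an affector moves them, and shading/rendering consume the result.

    const root = { particles: [] };                    // particle root

    function emit(root, count) {                       // emitter
      for (let i = 0; i < count; i++) {
        root.particles.push({
          position: { x: 0, y: 0, z: 0 },
          velocity: { x: Math.random() - 0.5, y: 1, z: Math.random() - 0.5 },
          life: 1.0,
          colour: { r: 1, g: 1, b: 1, a: 1 },
        });
      }
    }

    function applyGravity(root, dt) {                  // affector
      for (const p of root.particles) {
        p.velocity.y -= 9.8 * dt;
        p.position.x += p.velocity.x * dt;
        p.position.y += p.velocity.y * dt;
        p.position.z += p.velocity.z * dt;
        p.life -= dt;
      }
      root.particles = root.particles.filter(p => p.life > 0);
    }

    function shadeByLife(root) {                       // shading
      for (const p of root.particles) p.colour.a = Math.max(p.life, 0);
    }

    // One simulation step; a renderer would then draw root.particles as
    // sprites, trails, geometry and so on.
    emit(root, 100);
    applyGravity(root, 1 / 60);
    shadeByLife(root);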

Physics:
The nodes in this section allow you to create simple physics systems and dynamic movements for objects in your scene.

Post-FX:
Post-FX nodes perform image processing on the node they are parented to, if the parent is a suitable type – Video Processing nodes, Fields and Render to Texture. Otherwise, they are considered to work on “everything” – i.e. they are applied to the result of the full scene render. Post-FX nodes do not store copies of the image they are processing; they only modify the image stored by the parent.

Procedural:
Procedural nodes are used to generate geometric and volumetric forms implicitly using signed distance fields.
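
For reference, a signed distance field returns the distance from any point to the nearest surface, negative inside the form, and shapes combine implicitly through simple operations on those distances. A minimal sketch (generic JavaScript, not Notch code):

    // Generic signed distance functions: negative inside the shape, positive outside.
    function sdSphere(p, centre, radius) {
      const dx = p.x - centre.x, dy = p.y - centre.y, dz = p.z - centre.z;
      return Math.sqrt(dx * dx + dy * dy + dz * dz) - radius;
    }

    // The union of two fields is the minimum of their distances.
    function sdUnion(d1, d2) {
      return Math.min(d1, d2);
    }

    const p = { x: 0, y: 0, z: 0 };
    const d = sdUnion(
      sdSphere(p, { x: 0, y: 0, z: 0 }, 1.0),
      sdSphere(p, { x: 1.5, y: 0, z: 0 }, 1.0)
    );
    // d < 0 means the point is inside the combined form.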

Scripting:
Scripting nodes allow for project behaviour to be scripted using JavaScript.

Shading Nodes:
Shading nodes are used to generate shader code used for rendering 3D objects and other types of geometry.

Sound:
These nodes control sound and sound output in Notch.

Video Processing:
Video Processing nodes perform image processing while also storing a new copy of the image. This allows processing chains with multiple branches to be created. They must ultimately be connected to a video processing source node – such as a Video Source or Video In Source – which forms the start of the processing chain.
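
A conceptual contrast with Post-FX nodes (illustrative JavaScript, not the Notch API): a Video Processing style node stores a new copy of the image it outputs, so several branches can process the same source independently, while a Post-FX style effect modifies its parent's stored image in place.

    function videoProcessingNode(inputImage, effect) {
      const output = inputImage.slice();   // new copy owned by this node
      effect(output);
      return output;                       // downstream branches read this copy
    }

    function postFxNode(parentImage, effect) {
      effect(parentImage);                 // modifies the parent's image directly
      return parentImage;
    }

    // Branching chain: both branches start from the same untouched source frame.
    const source = new Float32Array(256 * 256 * 4);  // RGBA frame from a source node
    const blurred     = videoProcessingNode(source, img => { /* blur here */ });
    const thresholded = videoProcessingNode(source, img => { /* threshold here */ });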