From Filter Forge Wiki
Filter Forge's Sample-Based Architecture
Filter Forge uses a sample-based approach to image generation. Filter Forge components don't produce images by painting their pixels directly. Instead, Filter Forge 'asks' a component "what color does your image have at coordinates X and Y?", and the component 'answers' with four RGBA values. In other words, the resulting image is implicitly defined by the component, as opposed to being explicitly painted by it.
To 'ask' about the color at a given point of the image, Filter Forge calls the component's get_sample() function with the point coordinates as arguments; the function then calculates and returns the final 'answer' as four RGBA values.
During the execution of its own get_sample() function, a component may in turn call the get_sample() functions of the components connected to its inputs, and use the returned RGBA values to calculate its own resulting color. Essentially, this is how the filter tree works: a single get_sample() call to a component may cascade into further get_sample() calls down the tree. The arguments of get_sample() are described in the Coordinate System for Sampling section below.
To render a filter into a bitmap, the Filter Forge renderer cycles through all of its pixels and calls the get_sample() function of the root component of the filter tree at least once per pixel (anti-aliased pixels may require multiple calls; see below for details). For example, generating a non-antialiased bitmap with the dimensions of 30x30 pixels would require 900 calls to the root component's get_sample() function.
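The rendering loop described above can be sketched as follows. This is an illustrative model, not actual Filter Forge script code; the gradient used as the root component is a hypothetical stand-in:

```python
def get_sample(x, y):
    # A hypothetical root component: a horizontal black-to-white
    # gradient, answered as four RGBA values in the 0..1 range.
    return (x, x, x, 1.0)

def render(width, height):
    """Render a non-antialiased bitmap by sampling once per pixel."""
    bitmap = []
    for py in range(height):
        row = []
        for px in range(width):
            # Sample at the pixel center, converted to 0..1 coordinates.
            row.append(get_sample((px + 0.5) / width, (py + 0.5) / height))
        bitmap.append(row)
    return bitmap

bitmap = render(30, 30)  # 30x30 pixels -> 900 get_sample() calls
```

Note that the renderer only ever *pulls* samples from the root; the root component is responsible for pulling whatever it needs from its own inputs.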
Executing a call to the get_sample() function of a component is called sampling, the coordinates of the point whose color is being requested are called sample coordinates, and the returned color is often referred to simply as a sample.
Sampling and Map Components
Map components (including map scripts) do their work by calling get_sample() functions of components connected to their inputs.
A component that has received a get_sample() call, either from an 'upstream' component it is connected to or directly from the Filter Forge renderer, may in turn call get_sample() of other components connected to its inputs. This way, components in the filter tree can obtain results generated by their 'downstream' components and use them to calculate their own results.
Filter Forge components are free to call get_sample() of their map inputs any number of times, with any sample coordinates they want, and they can combine the RGBA values returned by these calls in any manner. For example, distortion components may shoot downstream samples at coordinates different from those specified in the upstream call they received. Another example is adjustment components, which usually shoot downstream samples at the same coordinates as those given in the upstream call – they just perform some form of color correction on the returned sample color and return the result.
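The two examples above can be sketched like this. All three components are hypothetical and the code is a conceptual model, not Filter Forge script syntax:

```python
import math

def checker(x, y):
    """A hypothetical source component: a 2x2 checkerboard in 0..1 space."""
    v = 1.0 if (int(x * 2) + int(y * 2)) % 2 == 0 else 0.0
    return (v, v, v, 1.0)

def wave_distortion(source, x, y, amplitude=0.1):
    """A distortion component: samples its input at *shifted* coordinates."""
    return source(x + amplitude * math.sin(y * 2 * math.pi), y)

def invert(source, x, y):
    """An adjustment component: samples its input at the *same*
    coordinates, then color-corrects the returned sample."""
    r, g, b, a = source(x, y)
    return (1.0 - r, 1.0 - g, 1.0 - b, a)
```

Here the distortion changes *where* the downstream sample is shot, while the adjustment changes *what* is done with the returned color.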
Sampling and Bitmap-Based Map Components
Bitmap-based components such as Blur, Sharpen or Median are something of an exception to the sample-based approach. Their algorithms, by design, require multiple reads of the source image, so it makes sense to cache it in a bitmap. All this is encapsulated behind the usual get_sample() function which, in this case, simply fetches pixels from the internal bitmap cache.
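The caching behavior can be sketched roughly as follows; the cache resolution and the box-blur algorithm are illustrative stand-ins, not details of the actual components:

```python
class BoxBlur:
    """Sketch of a bitmap-based component: on first use it renders its
    source into an internal bitmap, then get_sample() reads the cache."""

    def __init__(self, source, resolution=64, radius=1):
        self.source = source
        self.resolution = resolution
        self.radius = radius
        self.cache = None  # built lazily on the first get_sample() call

    def _build_cache(self):
        n, r = self.resolution, self.radius
        # Render the source component into an internal bitmap once.
        src = [[self.source((px + 0.5) / n, (py + 0.5) / n)
                for px in range(n)] for py in range(n)]
        # Box-blur it: each output pixel averages a neighborhood of
        # source pixels -- the "multiple reads" that motivate caching.
        self.cache = []
        for py in range(n):
            row = []
            for px in range(n):
                acc, count = [0.0, 0.0, 0.0, 0.0], 0
                for dy in range(-r, r + 1):
                    for dx in range(-r, r + 1):
                        sx = min(max(px + dx, 0), n - 1)
                        sy = min(max(py + dy, 0), n - 1)
                        for c in range(4):
                            acc[c] += src[sy][sx][c]
                        count += 1
                row.append(tuple(v / count for v in acc))
            self.cache.append(row)

    def get_sample(self, x, y):
        if self.cache is None:
            self._build_cache()
        # Just fetch the nearest pixel from the cached bitmap.
        n = self.resolution
        return self.cache[min(int(y * n), n - 1)][min(int(x * n), n - 1)]
```

From the outside, the component still answers an ordinary get_sample() call; the cache is an internal implementation detail.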
Sampling in Curve Components
Curve components also use sampling, but instead of returning RGBA colors they return a single numeric value within the 0…1 range. In addition to the sample coordinates, the arguments of their get_sample() function include a parameter t – the input value at which the curve is evaluated.
You may wonder why a curve component needs the coordinates of the sample point. This is because Filter Forge's curve components can have map inputs, and therefore the curve they generate may have a different shape at different sample points. For more information on how parameter mapping works for curves, see Map Inputs.
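One way to picture this is the sketch below. The signature mirrors the description above rather than the exact Filter Forge API, and the gamma-ramp map input is hypothetical:

```python
def gamma_curve(x, y, t):
    """Sketch of a curve component with a map input: returns a value in
    the 0..1 range for input t, but the curve's shape (here, its gamma
    exponent) varies across the image because a map drives it."""
    # Hypothetical map input: gamma ramps from 1.0 at the left edge
    # to 3.0 at the right edge of the (0, 0)..(1, 1) region.
    gamma = 1.0 + 2.0 * x
    return t ** gamma
```

At the left edge the curve is the identity, while at x = 0.5 it behaves like t squared: the same t yields different values at different sample points.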
Coordinate System for Sampling
This section explains the sample coordinates, x and y, which are passed as arguments to a script's get_sample() function or to input-sampling functions such as get_sample_map().
A sample point with the coordinates of 0, 0 corresponds to the top-left corner of the image, and a point with the coordinates of 1, 1 corresponds to the bottom-right corner of a square Size x Size pixels wide, where Size is the current value of the global Size slider. When Size is set to its maximum value (the lesser of the width and height of the source image) and the image is square, the point 1, 1 corresponds to the bottom-right corner of the image.

The dependency of sample coordinates on the value of the Size slider may seem counter-intuitive and complicated, but in practice it frees script writers from having to deal with Size, image dimensions and aspect ratios manually. Unless you're writing an advanced script which needs to be Size-independent, you don't have to worry about Size or image dimensions – you can simply keep the important elements of the script's output image in the region between (0, 0) and (1, 1).
For advanced scripting, there are global variables that let you query the value of the Size slider, the width and height of the working image, and the relative width and height of the Seamless Region, which are needed for scripting seamless patterns.
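Under the coordinate convention above, converting a pixel position to sample coordinates involves only the Size value; the function and variable names below are illustrative, not the actual Filter Forge globals:

```python
def pixel_to_sample(px, py, size):
    """Map a pixel position to sample coordinates: (0, 0) is the
    top-left corner, and (1, 1) is the bottom-right corner of a
    size-by-size pixel square, so both axes divide by the same
    Size value regardless of the image's aspect ratio."""
    return (px / size, py / size)

# For a 300x200 image with Size at its maximum (min(300, 200) = 200),
# the bottom-right image corner lies at sample coordinates (1.5, 1.0);
# only a square image puts that corner exactly at (1, 1).
```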
See also: Seamless Region variables and coordinate wrapping.
Anti-Aliasing
Anti-aliasing is the removal of jagged edges. A naive approach would be to simply shoot multiple samples for each pixel of the bitmap being rendered, and then average all the returned sample colors to produce the final pixel color. However, this approach wastes processing cycles if the image is composed mostly of solid-colored or smooth areas, which is often the case. To save rendering time, Filter Forge uses a smart anti-aliasing algorithm: all Filter Forge components can report areas in their output where aliasing is likely to occur, and the Filter Forge renderer applies anti-aliasing to these areas specifically.
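The naive per-pixel supersampling dismissed above can be sketched as follows; Filter Forge's smart algorithm instead restricts this extra work to the areas that components flag as alias-prone (this sketch is a conceptual model, not the renderer's actual code):

```python
def antialiased_pixel(get_sample, px, py, width, height, grid=3):
    """Naive supersampling: shoot a grid of samples inside one pixel
    and average them into the final pixel color. Doing this for every
    pixel wastes cycles on smooth areas, which is exactly what smart
    anti-aliasing avoids."""
    acc = [0.0, 0.0, 0.0, 0.0]
    for sy in range(grid):
        for sx in range(grid):
            # Spread the sample points evenly inside the pixel.
            x = (px + (sx + 0.5) / grid) / width
            y = (py + (sy + 0.5) / grid) / height
            sample = get_sample(x, y)
            for c in range(4):
                acc[c] += sample[c]
    return tuple(v / (grid * grid) for v in acc)
```

With a 3x3 grid, every anti-aliased pixel costs nine get_sample() calls instead of one, which is why limiting it to reported problem areas pays off.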