![[Uncaptioned image]](https://cdn.awesomepapers.org/papers/1527d3b3-cfc9-493d-a9f7-91b74d18e3e7/teaser.jpg)
Some results obtained with our method. Left: distorted images of an accretion disc due to gravitational light bending, with relativistic Doppler and beaming effects. Middle: gravitational lensing creates several amplified images of each point-like star, as well as Einstein rings. Right: near the speed of light, light is amplified and blue-shifted ahead of the observer, and dimmed and red-shifted behind.
Real-time High-Quality Rendering of Non-Rotating Black Holes
Abstract
We propose a real-time method to render high-quality images of a non-rotating black hole with an accretion disc and background stars. Our method is based on beam tracing, but uses precomputed tables to find the intersections of each curved light beam with the scene in constant time per pixel. It also uses a specific texture filtering scheme to integrate the contribution of the light sources to each beam. Our method is simple to implement and achieves high frame rates.
1 Introduction
Black holes are strange objects that recently received a lot of public exposure with the Interstellar movie [JvTFT15], the detection of gravitational waves from merging black holes [AAA+16], and the first image of a black hole [EHT19]. A real-time, high-quality visualization of a black hole could help the public get an intuitive "understanding" of their properties, for instance in planetariums or in 3D astronomy software. It could also be useful in space games. In this context, we propose a real-time, high-quality rendering method for non-rotating black holes, with two contributions: a precomputation method for constant-time beam tracing, and a texture filtering scheme to compute the contribution of the light sources to each beam.
2 Related work
Black hole visualization has a long history, starting with [Lum79] and summarized in [Lum19]. Offline rendering methods generally use beam tracing in curved space-time, support rotating black holes and produce very high-quality images [Ham14, Ria14, JvTFT15]. However, they are complex to implement and are not interactive (e.g. an IMAX Interstellar frame requires at least 30 minutes with 10 cores, and the renderer has about 40 kLoC [JvTFT15]). Physically accurate general relativistic magnetohydrodynamics simulations of accretion discs are even more complex and require supercomputers [LHT+18].

Our work is more related to interactive visualization methods, usually restricted to non-rotating black holes to reduce the complexity of the problem. [MW10] render the background stars around a black hole, whose apparent positions, colors and intensities change due to the gravitational effects. Each star is rendered with a point primitive, and its projection(s) on screen are found in constant time by using a precomputed lookup table. [MB11] render a torus and a background night sky texture for an observer orbiting a black hole, with a ray-tracing method. Ray intersections with the scene are found in constant time thanks to lookup tables precomputed with a parallelized code (for a fixed orbit radius). [MF12] use ray tracing to render an accretion disc around a black hole. Ray intersections are found in constant time by using an analytic expression involving the Jacobi-sn function (evaluated with arithmetic-geometric and complex-number series).
In comparison, our method uses only two small tables, one for the ray deflection and one for the inverse radius along the ray, which are very fast to precompute. They are used to find, in constant time, the intersection(s) of curved light beams with the accretion disc and with millions of background stars (stored in a cubemap with a specific filtering scheme).
3 Model
Our goal is to render a non-rotating black hole with an accretion disc and background stars, illustrating the effects of gravitation on light. Simulating a realistic accretion disc is not a goal: we thus use a basic, infinitely thin disc model instead. However, we want to get real-time and high-quality images, which is not easy:
- a simple ray-marching algorithm can render a sky map texture distorted by a black hole in real time, but not with high quality (e.g. stars become curved segments instead of staying point-like),
- conversely, offline beam-tracing methods produce high-quality images but are not real-time [JvTFT15].
To this end, we propose a "precomputed beam tracing" method: for each pixel, we initialize a light beam, compute its intersections with the scene using precomputed tables, and then compute the light received from the intersected objects. These three steps are explained below, after a very short introduction to the Schwarzschild metric.
3.1 Schwarzschild metric
The space-time geometry around a non-rotating black hole can be described with the Schwarzschild metric. In units such that the radius of the black hole’s event horizon and the speed of light are 1, this metric is
$ds^2 = -\left(1 - \frac{1}{r}\right) dt^2 + \left(1 - \frac{1}{r}\right)^{-1} dr^2 + r^2 \left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)$ (1)
where $ds$ is the line element and $(t, r, \theta, \varphi)$ are the Schwarzschild coordinates [Wei72]. $(r, \theta, \varphi)$ are (pseudo-)spherical coordinates, and in the following we also use the corresponding (pseudo-)Cartesian coordinates $(x, y, z)$, as well as the inverse radius $u = 1/r$. In particular, the Cartesian coordinates of the vectors of an orthonormal basis for a static observer at this point are given by (see Fig. 1)
(2)
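For reference, the (pseudo-)Cartesian coordinates used throughout the paper are related to the (pseudo-)spherical Schwarzschild coordinates by the usual conversion formulas; a minimal C++ sketch (our own helper, not part of the published code):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// (Pseudo-)Cartesian coordinates corresponding to the (pseudo-)spherical
// Schwarzschild coordinates (r, theta, phi), and the inverse mapping.
Vec3 ToCartesian(double r, double theta, double phi) {
  return {r * std::sin(theta) * std::cos(phi),
          r * std::sin(theta) * std::sin(phi),
          r * std::cos(theta)};
}

void ToSpherical(const Vec3& p, double* r, double* theta, double* phi) {
  *r = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
  *theta = std::acos(p.z / *r);
  *phi = std::atan2(p.y, p.x);
}
```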
3.2 Beam initialization
The first step of our method is to compute, for each pixel, the initial direction of the corresponding light beam. As in [MW10], and in order to simplify the next steps, we take advantage of some symmetries to reduce this direction to a single angle, as shown below.
Let the camera position be given in Schwarzschild coordinates, and let a Lorentz transformation [Wei72] specify the camera orientation and velocity with respect to a static observer at this position. An orthonormal basis for the camera is then obtained by applying this Lorentz transformation to the static observer basis (see Fig. 1):
(3)
where the first basis vector is the camera 4-velocity and the other three define its orientation. For a pinhole camera with a given focal length, and since beams are traced backward, the initial beam direction for a pixel with given screen coordinates is (see Fig. 1)
(4)
where denotes the projection of on the hyperplane.
We now take advantage of the spherical symmetry of the metric, and of the fact that its geodesics are planar [Wei72], to reduce this direction to a single angle. Let us introduce rotated Schwarzschild coordinates such that the beam's axial ray is contained in their equatorial plane. They can be defined as the (pseudo-)spherical coordinates corresponding to the following new orthonormal basis vectors (for the Euclidean metric – see Fig. 1):
(5)
In these rotated coordinates the metric (1) keeps the same form, and the light beam starts from the camera position with a single initial angle (see Fig. 1).
Finally, note that in this plane the accretion disc becomes two line segments, at two angles from the same axis, given by
(6)
where the sign is chosen as illustrated in Fig. 1.
3.3 Beam tracing
The second step of our method is to compute the beam intersections with the scene, and the light emitted there. For this we first need to determine the geodesic followed by the beam’s axial ray.
For light rays $ds^2 = 0$, and there exists a curvilinear coordinate $\lambda$ along the ray, defined up to an affine transform, such that $(1-u)\,dt/d\lambda$ and $r^2\,d\varphi/d\lambda$ are constant along the ray [Wei72]. We can thus choose $\lambda$ such that the second constant is 1, leading to
$\frac{dt}{d\lambda} = \frac{e}{1-u}, \qquad \frac{d\varphi}{d\lambda} = u^2$ (7)
where $e$ happens to be the inverse of the ray's impact parameter (see Fig. 1). By substituting this in (1) with $ds = 0$ and $\theta = \pi/2$ we get the geodesic equation
$\left(\frac{du}{d\varphi}\right)^2 = e^2 - u^2\,(1-u)$ (8)
Integrating this numerically at each pixel, with a high precision, would be too slow (see Section 5). Alternatively, the analytic solution for $u(\varphi)$, using the Jacobi-sn function (not available on GPUs), could be implemented with numerical series [MF12]. However, we also need the retarded time (to animate the accretion disc) and the light ray deflection (for the stars). To compute all this easily and efficiently, we use instead two small precomputed tables. We explain below how we precompute and use these tables to find the beam intersections, thanks to some ray properties that we present first.
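For illustration, here is a minimal sketch of how (8) can be integrated for a ray coming from infinity, using its $e$-independent second-order form $d^2u/d\varphi^2 = \tfrac{3}{2}u^2 - u$ (obtained by differentiating (8)) and the simple Euler scheme mentioned in Section 3.3.2; the function name and output format are ours, and the actual precomputation uses the non-linear table mappings of Appendix A:

```cpp
#include <vector>

// Integrates the null geodesic equation (8), rewritten as
// d^2u/dphi^2 = 3/2 u^2 - u, for a ray coming from infinity with inverse
// impact parameter e, using a simple Euler scheme with angular step dphi.
// Returns the sampled inverse radius u as a function of the angle phi,
// stopping when the ray falls into the black hole (u >= 1) or escapes back
// to infinity (u <= 0).
std::vector<double> IntegrateRayFromInfinity(double e, double dphi) {
  double u = 0.0;       // inverse radius, 0 at infinity.
  double du_dphi = e;   // from (8): (du/dphi)^2 = e^2 at u = 0.
  std::vector<double> samples;
  while (u < 1.0 && (u > 0.0 || samples.empty())) {
    samples.push_back(u);
    double d2u_dphi2 = 1.5 * u * u - u;
    u += du_dphi * dphi;
    du_dphi += d2u_dphi2 * dphi;
    if (samples.size() > 1000000) break;  // safety bound.
  }
  return samples;
}
```

The number of samples times the angular step gives the deflection used in Section 3.3.2, and the samples themselves give the $u(\varphi)$ values used in Section 3.3.3.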
3.3.1 Ray properties
Light rays can be divided into 3 types. If $e^2$ is larger than the maximum of $u^2(1-u)$ over $u$, equal to $4/27$ and reached at the photon sphere $u = 2/3$ (i.e. $r = 3/2$), then (8) shows that all values of $u$ are possible. The light ray thus comes from infinity into the black hole, or vice-versa. Otherwise, some values of $u$ around $2/3$ are excluded. The ray either stays in the (empty) region $r < 3/2$, or comes from infinity, reaches an apsis $u_a$, and goes back to infinity. In the latter case $u_a$ is given by setting $du/d\varphi = 0$ in (8):
$u_a^2\,(1 - u_a) = e^2$ (9)
and the light ray is unchanged by a reflection about the apsis direction (see Fig. 1). In any case, changing $\varphi$ into $-\varphi$ changes a solution of (8) into another one, with the initial and final angles changed accordingly.
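A minimal sketch of this classification (our code, not the paper's): the apsis, when it exists, is obtained from (9) by bisection, since $u^2(1-u)$ is increasing on $(0, 2/3)$:

```cpp
#include <optional>

// Classifies a light ray from its squared inverse impact parameter e^2,
// following Section 3.3.1: if e^2 > 4/27 (the maximum of u^2(1-u), reached
// at the photon sphere u = 2/3) the ray goes from infinity into the black
// hole (or vice-versa) and has no apsis. Otherwise, for a ray coming from
// infinity, the apsis u_a is the smallest positive root of (9),
// u_a^2 (1 - u_a) = e^2, which lies in (0, 2/3) and is found by bisection.
std::optional<double> ApsisInverseRadius(double e_squared) {
  constexpr double kPhotonSphereU = 2.0 / 3.0;
  constexpr double kMax = 4.0 / 27.0;
  if (e_squared >= kMax) return std::nullopt;  // no apsis: ray crosses r=3/2.
  double lo = 0.0, hi = kPhotonSphereU;
  for (int i = 0; i < 64; ++i) {
    double mid = 0.5 * (lo + hi);
    if (mid * mid * (1.0 - mid) < e_squared) lo = mid; else hi = mid;
  }
  return 0.5 * (lo + hi);
}
```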
3.3.2 Precomputations and beam tracing: background stars
For background stars we first compute the beam’s escape angle, and then sum the light emitted by all the stars in the beam’s footprint on the celestial sphere, around this escape direction.
Escape angle
Let the beam's escape angle be the direction in which the (backward traced) beam escapes to infinity, measured from the same axis as the initial angle (it is undefined if the beam falls into the black hole). For efficient rendering, this escape angle could be precomputed for all initial conditions, but this would yield a needlessly large precomputation. Instead, we precompute the deflection of rays coming from infinity (see Fig. 1) in a table, indexed by the inverse radius and by the inverse impact parameter or the apsis (depending on the ray type, and taking advantage of the above symmetries). This gives a simple precomputation algorithm (see Algorithm 3.3.1 – we use the Euler method, but Runge-Kutta or other methods are possible too). At runtime, we compute the escape angle by combining the tabulated deflection with the initial angle, with a sign depending on the ray direction (see Fig. 1 and Algorithm 3.3.1).
In practice the deflection is defined only in a subset of its parameter domain, diverges on part of its boundary, and varies rapidly near the photon sphere (because rays make more and more turns around it before falling or escaping). For good precision we thus map the parameter domain non-linearly into a square domain, designed to get more samples in these regions (see Appendix A).
Emitted light
It now remains to compute the light emitted from all the stars in the beam's footprint on the celestial sphere, around the escape direction. For extended sources such as nebulae or galaxies, we can simply take advantage of anisotropic texture filtering, by storing these sources in a cube map. For point-like stars, however, this would yield unrealistically stretched star images. Our solution is to use a manually filtered cube map (see Fig. 2):
- Each texture element (or texel) stores the color and position (in the texel) of at most one star. A color sum (and not an average) and a luminosity-weighted position average are used for mip-mapping (see the sketch below).
- We compute the beam's footprint in the cube map by using screen space partial derivatives (implemented with finite differences by the rendering pipeline). To avoid discontinuities at cube edges, we compute the partial derivatives of the escape direction, and then compute the derivatives of the cube map face texture coordinates analytically from them (e.g. for the +z face, from the derivatives of $x/z$ and $y/z$).
- We compute a mipmap level from the size of the footprint, fetch all the texels at this level in the footprint, and accumulate the colors of the corresponding stars. For anti-aliasing, and to conserve the total intensity, we view each star as a one pixel wide square and multiply its intensity by the area of its intersection with the considered screen pixel, computed from the star's subpixel coordinates. Note that this requires considering an extended footprint (see Fig. 2). In our implementation, we select the mipmap level so that the footprint spans at most a small, fixed number of texels.
Note that this method approximates quadrilateral footprints with parallelograms, and does not use interpolation across mipmap levels. This would be easy to fix, but is not really necessary since our method already gives very good results.
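As an illustration of the first point above, here is a minimal sketch of how one mipmap level of a star cube map face could be built from the previous one, with a color sum and a luminosity-weighted position average per 2×2 block; the texel layout and the luminosity definition are our own simplifications, not the paper's actual format:

```cpp
#include <vector>

struct StarTexel {
  float r = 0, g = 0, b = 0;  // summed star color (linear radiance).
  float u = 0, v = 0;         // star position within the texel, in [0,1).
};

// Builds mip level n+1 (half resolution) from mip level n (size x size):
// colors are summed (not averaged) to conserve the total intensity, and
// positions are averaged with the star luminosities as weights.
std::vector<StarTexel> Downsample(const std::vector<StarTexel>& src, int size) {
  int half = size / 2;
  std::vector<StarTexel> dst(half * half);
  for (int j = 0; j < half; ++j) {
    for (int i = 0; i < half; ++i) {
      StarTexel out;
      float weight_sum = 0;
      for (int dj = 0; dj < 2; ++dj) {
        for (int di = 0; di < 2; ++di) {
          const StarTexel& s = src[(2 * j + dj) * size + 2 * i + di];
          float w = s.r + s.g + s.b;  // luminosity proxy used as weight.
          out.r += s.r; out.g += s.g; out.b += s.b;
          // Position of s, expressed in the destination texel's frame.
          out.u += w * 0.5f * (di + s.u);
          out.v += w * 0.5f * (dj + s.v);
          weight_sum += w;
        }
      }
      if (weight_sum > 0) { out.u /= weight_sum; out.v /= weight_sum; }
      dst[j * half + i] = out;
    }
  }
  return dst;
}
```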

3.3.3 Precomputations and beam tracing: accretion disc
As for stars, we compute the beam intersection(s) with the accretion disc by using a precomputed table. We then compute the light emitted there by using a simple procedurally animated disc model.
Intersections
Let $r_{in}$ and $r_{out}$ be the inner and outer radius of the disc (with $r_{in} \geq 3$, the innermost stable circular orbit radius [Las16]). Since the intersections can only occur at the two disc angles of Section 3.2 (see Fig. 1), we only need the function $u(\varphi)$ to check whether the inverse radius at these angles falls inside $[1/r_{out}, 1/r_{in}]$. For this we precompute $u(\varphi)$, for light rays coming from infinity, in a table. At runtime, $u$ at the disc angles can then be computed from this table, using the angular position of the camera along the ray (which can be obtained from the deflection table – see Fig. 1).
Note that we don't need $u(\varphi)$ for all $\varphi$: we can stop when $u > 1/3$, since no intersection can occur between this point and the apsis, if any (and the rest can be deduced by symmetry). In practice, this means that we only need $u(\varphi)$ for $u \leq 1/3$, which has two consequences:
- we don't need to evaluate $u(\varphi)$ for all angles: only a bounded range of $\varphi$ is needed,
- there can be at most two intersections: one on each symmetric part of the ray (if it does not fall into the black hole).
The algorithms to precompute this table and to find the accretion disc intersections follow from these properties and those of Section 3.3.1, and are shown in Algorithm 3.3.1. As for the deflection table, we map the table's domain non-linearly into a square domain to get good precision in large gradient areas (see Appendix A).
Finally, note that for an animated disc we also need to compute the retarded time between the intersections and the camera. For this we also precompute and store the coordinate time along the ray – using $dt/d\lambda$ and $d\varphi/d\lambda$ from (7) – in the two tables. This allows the computation of the retarded times at the disc intersections, as shown in Algorithm 3.3.1.
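A minimal sketch of the resulting runtime test at one of the two disc angles, assuming a hypothetical accessor for the precomputed $u(\varphi)$ table (the real implementation uses the non-linear mapping of Appendix A and is given in Algorithm 3.3.1):

```cpp
#include <functional>

// Checks whether a ray coming from infinity with inverse impact parameter e
// crosses the accretion disc at the disc angle phi (one of the two line
// segment angles of Section 3.2). u_of_phi is a lookup into the precomputed
// u(phi) table (hypothetical accessor, provided by the caller). r_in and
// r_out are the disc radii, with r_in >= 3.
bool IntersectsDisc(const std::function<double(double, double)>& u_of_phi,
                    double e, double phi, double r_in, double r_out) {
  double u = u_of_phi(e, phi);           // inverse radius at the disc angle.
  return u >= 1.0 / r_out && u <= 1.0 / r_in;
}
```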

Emitted light
It now remains to compute the light emitted by the accretion disc at the intersection points. For this we use the light emitted by a black body at the local disc temperature, times the disc density. We use the temperature profile from [Las16], and compute the density with a sum of procedural particles moving along quasi-circular precessing orbits (see Fig. 3 and Appendix B).
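For reference, the black body spectral radiance used here is given by Planck's law; a minimal, self-contained sketch in SI units (independent of the paper's code):

```cpp
#include <cmath>

// Planck's law: spectral radiance of a black body at temperature T (kelvins)
// and wavelength lambda (meters), in W · sr^-1 · m^-3.
double BlackBodyRadiance(double lambda, double T) {
  constexpr double h = 6.62607015e-34;  // Planck constant (J·s).
  constexpr double c = 2.99792458e8;    // speed of light (m/s).
  constexpr double kB = 1.380649e-23;   // Boltzmann constant (J/K).
  double x = h * c / (lambda * kB * T);
  return 2.0 * h * c * c / std::pow(lambda, 5.0) / (std::exp(x) - 1.0);
}
```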
3.4 Shading
Due to gravitational effects, the light received at the camera is different from the emitted light, computed above. We present these effects below, and explain how we compute them.
3.4.1 Gravitational lensing effects
Due to gravitational lensing, the light emitted by a point-like star is received amplified by a factor $\Omega_c / \Omega_e$, where $\Omega_c$ (resp. $\Omega_e$) is the beam's solid angle at the camera (resp. emitter) [VE99]. We compute it by using the screen space partial derivatives of the beam directions at the camera and at the emitter:
$\frac{\Omega_c}{\Omega_e} \approx \frac{\lVert \partial_x \hat{\mathbf{d}}_c \times \partial_y \hat{\mathbf{d}}_c \rVert}{\lVert \partial_x \hat{\mathbf{d}}_e \times \partial_y \hat{\mathbf{d}}_e \rVert}$ (10)
where $\hat{\mathbf{d}}_c$ and $\hat{\mathbf{d}}_e$ are the normalized beam directions at the camera and at the emitter, and $\partial_x$, $\partial_y$ are the screen space partial derivatives. Note that this does not apply to area light sources, because the beam's subtended area varies in inverse proportion.
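A minimal CPU sketch of this computation, with the screen space derivatives replaced by explicit finite differences between neighboring pixels (in the WebGL renderer they would instead come from the built-in derivative instructions); the function and its arguments are our own illustration:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 operator-(const Vec3& a, const Vec3& b) {
  return {a.x - b.x, a.y - b.y, a.z - b.z};
}
Vec3 Cross(const Vec3& a, const Vec3& b) {
  return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
double Length(const Vec3& v) {
  return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

// Lensing amplification (10) for one pixel: ratio of the beam's solid angle
// at the camera to its solid angle at the emitter, estimated from the
// normalized beam directions of the pixel and of its +x and +y neighbors.
double Amplification(const Vec3& cam, const Vec3& cam_dx, const Vec3& cam_dy,
                     const Vec3& emit, const Vec3& emit_dx, const Vec3& emit_dy) {
  double omega_c = Length(Cross(cam_dx - cam, cam_dy - cam));
  double omega_e = Length(Cross(emit_dx - emit, emit_dy - emit));
  return omega_c / omega_e;  // caller should guard against omega_e == 0.
}
```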
3.4.2 Doppler and beaming effects
Due to gravitational and relativistic time dilation and length contraction effects, the frequency $\nu_r$ of the received light differs from the emitted frequency $\nu_e$. The ratio is given by [PHL17]
$\frac{\nu_r}{\nu_e} = \frac{g_{\mu\nu}\, k_r^\mu\, u_r^\nu}{g_{\mu\nu}\, k_e^\mu\, u_e^\nu}$ (11)
where $u_r$ is the 4-velocity of the receiver, $k_r = dx/d\lambda$ is the tangent 4-vector, at the receiver, of the light ray curve $x(\lambda)$, and $u_e$ and $k_e$ are the corresponding emitter quantities. We thus need $u_r$, $k_r$, $u_e$ and $k_e$, which we compute as follows.
In rotated Schwarzschild coordinates, $k_r$ and $k_e$ are given by $k = dx/d\lambda = (dt/d\lambda, dr/d\lambda, 0, d\varphi/d\lambda)$. Using (7) and (8), this gives
$k = \left(\frac{e}{1-u},\ -\frac{du}{d\varphi},\ 0,\ u^2\right)$ (12)
where $du/d\varphi$ is the negative root of (8) – for the actual light rays it is negative, unlike in the previous section where rays were traced backward. In non-rotated Schwarzschild coordinates, we have
(13)
for the accretion disc (if we assume a circular motion) [PHL17], as well as for static stars and for the camera. Finally, to compute the scalar products in (11), we need the corresponding rotated coordinates: the $t$ and $r$ components are unchanged, and the $\theta$ component is not needed since $k^\theta = 0$. For instance, for the accretion disc, we get
(14)
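As a simple sanity check of (11) and (12) (ours, not from the paper): for a static emitter at radius $r_e$ and a static receiver at radius $r_r$, whose 4-velocities have a single non-zero component $(1-1/r)^{-1/2}$ along $t$, (11) reduces to the standard gravitational redshift formula

$$\frac{\nu_r}{\nu_e} = \sqrt{\frac{1 - 1/r_e}{1 - 1/r_r}},$$

so light emitted close to the horizon ($r_e \to 1$) is received with a vanishing frequency.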
The above Doppler effect has an associated beaming effect: the received intensity differs from the emitted one because, from Liouville's theorem, $I_\nu / \nu^3$ is invariant [MTW73]. In terms of wavelength $\lambda$, and with $D = \nu_r / \nu_e$, this gives $I_r(\lambda) = D^5\, I_e(D\lambda)$. For black bodies the two effects result in a simple temperature shift $T \rightarrow D\,T$. For other light sources however, which we want to support, the result is more complex. We thus precompute it in a 3D texture, for each chromaticity and Doppler factor $D$.
To this end we need to choose a spectrum for each chromaticity, among the infinite number of possible spectra. For simplicity, and to get black body spectra for black body colors, we use spectra $S(\lambda)$ combining the black body spectrum at the correlated color temperature of the chromaticity with two fixed absorption spectra, whose two coefficients are obtained from the chromaticity by solving a linear system. The Doppler and beaming effects then give CIE XYZ colors that we precompute with
$(X, Y, Z) = \int D^5\, S(D\lambda)\, \big(\bar{x}(\lambda), \bar{y}(\lambda), \bar{z}(\lambda)\big)\, d\lambda$ (15)
where $\bar{x}$, $\bar{y}$ and $\bar{z}$ are the CIE color matching functions. At runtime, the emitted XYZ color computed in Section 3.3 is transformed into the received color by a lookup in this texture, using the Doppler factor from (11).
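A minimal sketch of the precomputation of one entry of this 3D texture, assuming hypothetical helpers for the chosen spectrum and for the CIE color matching functions (the exact spectrum model and texture layout are in the published source code):

```cpp
#include <array>
#include <cmath>
#include <functional>

// Integrates the received spectrum D^5 S(D*lambda) against the CIE color
// matching functions, as in (15), for one chromaticity and one Doppler
// factor. 'spectrum' is the spectrum S chosen for this chromaticity and
// 'cie_xyz' returns (x̄, ȳ, z̄) at a wavelength in nanometers; both are
// hypothetical helpers, not functions from the paper.
std::array<double, 3> ReceivedXyz(
    const std::function<double(double)>& spectrum,
    const std::function<std::array<double, 3>(double)>& cie_xyz,
    double doppler) {
  constexpr double kMinLambda = 360.0, kMaxLambda = 830.0, kStep = 1.0;
  double d5 = std::pow(doppler, 5.0);  // beaming factor D^5.
  std::array<double, 3> xyz = {0.0, 0.0, 0.0};
  for (double lambda = kMinLambda; lambda <= kMaxLambda; lambda += kStep) {
    double s = d5 * spectrum(doppler * lambda);  // received spectrum.
    std::array<double, 3> cmf = cie_xyz(lambda);
    for (int i = 0; i < 3; ++i) xyz[i] += s * cmf[i] * kStep;
  }
  return xyz;
}
```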
3.4.3 Lens glare effects
Due to light scattering and diffraction inside the eye, haloes appear around very bright light sources, which would otherwise be hard to distinguish from fainter sources. For this reason we apply a bloom shader effect on the final image, before tone-mapping. We use a series of small support filter kernels on mipmaps of the full image, approximating a point spread function from [SIS+95], but more precise methods are possible too [HESL11].
4 Implementation

We implemented our method in C++ for the precomputations, and WebGL 2 for the rendering. The full source code and an online demo are available at https://github.com/ebruneton/black_hole_shader. The demo simulates a static or freely falling observer (see Fig. 4) and allows the user to set various parameters (black hole mass, disc temperature and density, camera orbit, etc).
We precompute the two tables of Section 3.3 in RG32F textures. Precompute takes about 11 seconds on an Intel Core i5-6500 CPU, and unit tests show that the TraceRay results are within a small tolerance of reference values computed without intermediate textures. We precompute the Doppler and beaming 3D texture of Section 3.4.2 in an RGB32F texture, in about 7 seconds.
We also precompute two RGB9E5 cubemaps, for the area and point-like light sources, from the Gaia DR2 [Gai18] and Tycho-2 [HFM+00] star catalogs. This requires downloading and processing 550 GB of compressed data, which can take a day, and yields millions of point-like light sources. In our implementation we don't store a sub-texel position for each star: instead, we use a hash of its color to compute a pseudo-random position.
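A minimal sketch of the last point, with an arbitrary integer hash (the quantization and hash below are our own choices, not necessarily those of the published shader):

```cpp
#include <cstdint>

// Derives a pseudo-random sub-texel position in [0,1)^2 from a star's color,
// so that the position does not need to be stored. The quantization and the
// hash function below are arbitrary choices, not the ones from the paper.
void StarSubTexelPosition(float r, float g, float b, float* u, float* v) {
  uint32_t h = static_cast<uint32_t>(r * 255.0f) * 73856093u ^
               static_cast<uint32_t>(g * 255.0f) * 19349663u ^
               static_cast<uint32_t>(b * 255.0f) * 83492791u;
  h ^= h >> 16;
  h *= 0x45d9f3bu;
  h ^= h >> 16;
  *u = (h & 0xFFFFu) / 65536.0f;
  *v = ((h >> 16) & 0xFFFFu) / 65536.0f;
}
```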
5 Results and discussion
Some results obtained with our method are shown in the teaser figure. They are rendered in Full HD at 114 to 160 frames per second on an NVidia GeForce GTX 960 (see method M1 in Table 1).
The benefit of our precomputed tables can be measured by replacing TraceRay with a ray-marching method integrating (8) numerically (and keeping everything else unchanged). To get the same performance as with the precomputed tables, only a small number of integration steps can be used, and stars end up several degrees away from their correct positions. To get (almost) the same precision, up to 1000 integration steps must be used, and the framerate drops to about 40-45 fps (see method M3 in Table 1).
The benefit of our custom texture filtering method to render the stars can be measured by replacing it with the method from [MW10] (and keeping everything else unchanged). Each of the millions of stars is then rendered with anti-aliased point primitives, whose positions are computed with a lookup in a precomputed texture. This gives about 60-67 fps (see method M2 in Table 1). The main bottleneck is due to the fact that, inside the region of the Einstein rings, many stars project into the same pixel, leading to a lot of overdraw.
View | M1 (fps) | M2 (fps) | M3 (fps) | Stars | Disc | Bloom
---|---|---|---|---|---|---
1 | 150 | 64 | 43 | 2.99 | 0.82 | 1.11
2 | 160 | 59 | 45 | 2.64 | 0.78 | 1.15
3 | 153 | 67 | 45 | 2.99 | 0.69 | 1.10
4 | 114 | 64 | 41 | 4.14 | 1.87 | 1.08
A limitation of our method is that views from inside the horizon are not supported. Indeed, since no observer can remain static in this region, we can no longer specify the camera and initialize light beams by using a reference static observer. Also the Schwarzschild metric diverges at the horizon. However, by using different coordinates, as in [MW10], we believe that our method can be extended to support this case.
Another limitation is that motion blur, which is necessary for very high quality animations, is not supported. Also, because of the approximations in our custom texture filtering method, in some cases a few stars flicker when the camera moves. Fixing both quality issues might be easier by extending [MW10] rather than by extending our method, at the price of decreased performance.
Finally, another limitation of our method, and of [MW10] as well, is that rotating black holes are not supported. Because they are only axially symmetric, the "inverse ray tracing" approach of [MW10] would probably be hard to generalize to this case. Our precomputed ray tracing method, on the other hand, could in principle be generalized to 4D tables containing the deflected direction and the accretion disc intersections of any ray (specified with 2 position and 2 direction parameters). In practice however, obtaining precise 4D tables of reasonable size might be hard.
6 Conclusion
We have presented a beam tracing method relying on small precomputed textures to render real-time, high-quality images of non-rotating black holes. Our method is simple to implement and achieves high frame rates. Extending it to views from inside the horizon, and to rotating black holes, if this is possible, is left as future work.
Acknowledgments
We would like to thank Alain Riazuelo for proofreading this paper.
Appendix A Texture mappings
We store the deflection table and the $u(\varphi)$ table at non-linearly mapped texel coordinates, designed to concentrate samples in the large-gradient regions mentioned in Section 3.3; the exact mappings are given at https://ebruneton.github.io/black_hole_shader/black_hole/functions.glsl.html.
Appendix B Disc particles
The orbit of a point particle in the accretion disc can be expressed with the Jacobi-sn function [Dar59]. For quasi-circular orbits this can be approximated with
(16)
where the orbital angular velocity follows from (13) for circular orbits. For a linear particle parameterized along its length, the position of a point is obtained by shifting the parameter in (16). Thus, given a ray hit point, we compute the parameter of the "nearest" particle point, then the "distance" between the hit point and the linear particle center, and finally the particle density at the hit point with a smoothly decreasing function of this distance.
Appendix C Camera orbit
Position
The camera position is specified by its polar coordinates in an orbital plane with a given inclination (see Fig. 4). In Schwarzschild coordinates adapted to this orbital plane, the camera 4-velocity has components $dt/d\tau = E/(1 - 1/r)$ and $d\varphi/d\tau = L/r^2$, where $E$ and $L$ are two constants of motion. Substituting this in (1) gives $(dr/d\tau)^2 = E^2 - (1 - 1/r)(1 + L^2/r^2)$.
We use these relations to update the polar coordinates at each proper time step. The corresponding Cartesian coordinates, obtained by rotating the orbital plane by its inclination, then give the Schwarzschild coordinates $r$, $\theta$ and $\varphi$.
The above relations require the constants of motion $E$ and $L$. We compute them from the initial position, direction and speed (see Fig. 4). We get $E$ from the Lorentz factor of the camera with respect to a static observer at the initial position. Finally, using the above equations, we get $L$.
Lorentz transform
We compute the Lorentz transform from the static observer basis to the camera basis by using an intermediate orthonormal basis aligned with the orbital plane's normal (see Fig. 4), as follows.
Consider the rotation matrix from the static observer basis to this intermediate basis (only its lower right, spatial block is non-trivial). In this intermediate basis the camera 4-velocity and speed can be computed, and a reference frame for the camera is then obtained by applying the corresponding Lorentz boost [Wei72].
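For reference, a standard form of such a boost (our notation, not necessarily the paper's): for a velocity $\mathbf{v} = v^i \mathbf{e}_i$ of norm $v$ and Lorentz factor $\gamma = (1 - v^2)^{-1/2}$ (with $c = 1$), the boosted basis is

$$\mathbf{e}'_0 = \gamma\Big(\mathbf{e}_0 + \sum_j v^j\,\mathbf{e}_j\Big), \qquad \mathbf{e}'_i = \mathbf{e}_i + \frac{(\gamma - 1)\,v^i}{v^2}\sum_j v^j\,\mathbf{e}_j + \gamma\,v^i\,\mathbf{e}_0 .$$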
Finally, a user-specified rotation matrix is applied to set the camera orientation. Note that this procedure assumes that the camera orientation is actively controlled, i.e. it does not freely evolve as a gyroscope would.
References
- [AAA+16] B.P Abbott, R. Abbott, Thomas Abbott, Matthew Abernathy, Fausto Acernese, K. Ackley, C. Adams, Teneisha Adams, Paolo Addesso, R.X Adhikari, Vaishali Adya, C. Affeldt, M. Agathos, Kazuhiro Agatsuma, Nishu Aggarwal, Odylio Aguiar, L. Aiello, Anirban Ain, P. Ajith, and John Zweizig. Observation of gravitational waves from a binary black hole merger. Physical Review Letters, 116, 2 2016.
- [Dar59] Charles Galton Darwin. The gravity field of a particle. Proceedings of the Royal Society of London. Series A. Mathematical and Physical Sciences, 249(1257):180–194, 1959.
- [EHT19] EHT Collaboration et al. First M87 event horizon telescope results. I. The shadow of the supermassive black hole. The Astrophysical Journal Letters, 875:1, 2019.
- [Gai18] Gaia Collaboration et al. Gaia Data Release 2. Summary of the contents and survey properties. Astronomy and Astrophysics, 616:A1, 8 2018.
- [Ham14] Andrew Hamilton. Black hole flight simulator. 2014.
- [HESL11] Matthias B. Hullin, Elmar Eisemann, Hans-Peter Seidel, and Sungkil Lee. Physically-based real-time lens flare rendering. ACM Trans. Graph. (Proc. SIGGRAPH 2011), 30(4):108:1–108:9, 2011.
- [HFM+00] E. Høg, C. Fabricius, V. V. Makarov, S. Urban, T. Corbin, G. Wycoff, U. Bastian, P. Schwekendiek, and A. Wicenec. The Tycho-2 catalogue of the 2.5 million brightest stars. Astronomy and Astrophysics, 355:L27–L30, March 2000.
- [JvTFT15] Oliver James, Eugénie von Tunzelmann, Paul Franklin, and Kip S Thorne. Gravitational lensing by spinning black holes in astrophysics, and in the movie Interstellar. Classical and Quantum Gravity, 32(6):065001, 2 2015.
- [Las16] Jean-Pierre Lasota. Black hole accretion discs. In Cosimo Bambi, editor, Astrophysics of Black Holes: From Fundamental Aspects to Latest Developments, pages 1–60. Springer Berlin Heidelberg, 2016.
- [LHT+18] M. Liska, C. Hesp, A. Tchekhovskoy, A. Ingram, M. van der Klis, and S. Markoff. Formation of precessing jets by tilted black hole discs in 3D general relativistic MHD simulations. Monthly Notices of the Royal Astronomical Society, 474(1):L81–L85, 2 2018.
- [Lum79] J. P. Luminet. Image of a spherical black hole with thin accretion disk. Astronomy and Astrophysics, 75:228–235, 5 1979.
- [Lum19] Jean-Pierre Luminet. An illustrated history of black hole imaging: personal recollections (1972-2002). 3 2019.
- [MB11] Thomas Müller and Sebastian Boblest. Visualizing circular motion around a Schwarzschild black hole. American Journal of Physics, 79:63–73, 1 2011.
- [MF12] Thomas Müller and Jörg Frauendiener. Interactive visualization of a thin disc around a Schwarzschild black hole. European Journal of Physics, 33(4):955–963, 5 2012.
- [MTW73] C. W. Misner, K. S. Thorne, and J. A. Wheeler. Gravitation. 1973.
- [MW10] Thomas Müller and Daniel Weiskopf. Distortion of the stellar sky by a Schwarzschild black hole. American Journal of Physics, 78:204–214, 1 2010.
- [PHL17] Dennis Philipp, Eva Hackmann, and Claus Laemmerzahl. Redshift and frequency comparison in Schwarzschild spacetime. 11 2017.
- [Ria14] Alain Riazuelo. Simulation of starlight lensed by a camera orbiting a Schwarzschild black hole. 2014.
- [SIS+95] Greg Spencer, Taligent Inc, Peter Shirley, Kurt Zimmerman, and Donald Greenberg. Physically-based glare effects for digital images. Computer Graphics (Proc. SIGGRAPH 1995), 29:325–334, 08 1995.
- [VE99] Kumar Virbhadra and George Ellis. Schwarzschild black hole lensing. Physical Review D, 62, 04 1999.
- [Wei72] S. Weinberg. Gravitation and cosmology: principles and applications of the general theory of relativity. Wiley, 1972.