A real-time Minecraft terrain visualizer for augmented reality sandboxes using Xbox Kinect
video: https://github.com/user-attachments/assets/ce1f2003-3ed6-4056-a1da-17e1319ca550
git clone https://github.com/colemaring/MC-AR-Sandbox.git or download and extract the zip
Run npm init && npm i
Download BuildTools (https://www.spigotmc.org/wiki/buildtools/) to compile the Spigot server jar
Place the server jar in the MC-AR-Sandbox folder
Run the Spigot server jar once to initialize the server and accept the EULA
Move SpigotBlankPlugin.jar into the plugins folder
Run launch.bat
Join the Minecraft server using localhost
The blaze rod (gold stick) manually updates the terrain. This is useful when autoupdate is off and you want to refresh the terrain on demand.
The other blocks in the inventory represent the different biomes you can choose from. When you select a biome, all subsequent terrain updates use that biome. You can switch back at any time by selecting a different biome.
If the terrain is bugged or doesn't match the sandbox at all, left-click a biome block to rebuild the terrain.
/waterlevel - sets the water level of the world, e.g. /waterlevel 10
/autoupdate - enables or disables auto update. When enabled, the terrain is automatically updated to match the Kinect sensor data, at the interval set by the timer variable, e.g. /autoupdate on
/timer - sets the number of seconds between automatic terrain updates. Only used when autoupdate is on.
/default - resets the variables to the default values
All settings persist across server restarts; you can reset them with /default.
The default settings are as follows:
waterlevel = 10
autoupdate = false
timer = 1
biome = "mountains"
In topoprojection.py, experiment with the aspect_ratio and smoothing values to match the projection to the sand
This program uses the Kinect SDK to take depth data from an Xbox Kinect sensor. That data is parsed, scaled down, and rendered in real time in a Minecraft server.
I am using the DepthFrame class from the Kinect SDK, more specifically the Node version of the SDK (https://github.com/wouterverweirder/kinect2), and writing this data to a txt file many times per second (not the most elegant solution). This output file is then read asynchronously in the plugin's Java code on a separate thread from the Bukkit thread. The Bukkit API is what I use to place, remove, and change blocks in the Minecraft world. Bukkit operations like setting and removing blocks are very expensive, so several optimizations were made to reduce the number of block changes. My solution is to track the values that have changed since the last DepthFrame, store those changes in a ConcurrentHashMap, and iterate only through the columns that have changes, touching just the blocks that need to be modified.
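The diff-tracking idea is language-agnostic. Here is a minimal Python sketch of it; the actual plugin is Java and stores the changes in a ConcurrentHashMap so the reader thread and the Bukkit thread can share them safely, and all names below are illustrative:

```python
def diff_frames(prev, curr):
    """Return only the cells whose height changed between two depth frames.

    Each frame is a dict mapping an (x, z) column to a height. Only the
    entries that differ from the previous frame are returned, so the block
    updates can be limited to exactly those columns.
    """
    changes = {}
    for (x, z), height in curr.items():
        if prev.get((x, z)) != height:
            changes[(x, z)] = height
    return changes

prev = {(0, 0): 5, (0, 1): 7, (1, 0): 3}
curr = {(0, 0): 5, (0, 1): 8, (1, 0): 3}
print(diff_frames(prev, curr))  # {(0, 1): 8} -- only the changed cell
```

With a full Kinect frame arriving many times per second, skipping every unchanged column is what keeps the expensive Bukkit set-block calls to a minimum.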
topoprojection.py is straightforward. It reads that same output.txt file every x ms and displays the data using matplotlib; the height value determines the color of each point. I'm using a median filter to smooth out the edges between the different topographical levels, and the smoothing value can be adjusted at the top of the file. The projector projects the plot displayed by this program. The number of topographical levels, the distance between levels, and the color of each level can all be changed.
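The core of that loop can be sketched as follows. This is a minimal version under stated assumptions, not the actual script: the function names and the fabricated input grid are mine, and the real file also handles the aspect_ratio correction and re-reads output.txt on a timer.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import median_filter

SMOOTHING = 10  # median-filter window size; larger values smooth contour edges more

def smooth(depth_mm, size=SMOOTHING):
    """Median-filter the raw depth grid to clean up the topographic edges."""
    return median_filter(depth_mm, size=size)

def render(depth_mm):
    """Draw filled contours; the depth of each point determines its color."""
    levels = np.linspace(2300, 2575, 25)  # depth range in mm, from the example below
    colors = plt.cm.viridis(np.linspace(0, 1, len(levels)))
    plt.clf()
    plt.contourf(smooth(depth_mm), levels=levels, colors=colors)
    plt.axis("off")

# In the real loop the grid would come from output.txt every x ms,
# e.g. depth_mm = np.loadtxt("output.txt"); here we just fabricate one.
render(np.random.uniform(2300, 2575, size=(424, 512)))
```

The median filter is the knob that trades detail for clean contour edges, which is why it sits at the top of the file as a single adjustable value.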
Example with smoothing 10, a level distance of 25mm, and the following color settings:

```python
cmap = plt.get_cmap('gist_rainbow')
levels = np.linspace(2300, 2575, 25)
colors = plt.cm.viridis(np.linspace(0, 1, len(levels)))
```
![image1](https://github.com/user-attachments/assets/6df0f985-8911-4ad4-a94b-05c6f641a71e)
Example with smoothing 1:
![image2](https://github.com/user-attachments/assets/d051908b-e35e-4a3a-b9ec-27b9201285ef)