Audio engine overhaul
All checks were successful
studiorailgun/Renderer/pipeline/head This commit looks good

This commit is contained in:
austin 2024-03-09 15:54:47 -05:00
parent 7bf26fb47a
commit 7ab6cf96d9
49 changed files with 1295 additions and 208 deletions

.vscode/launch.json vendored
View File

@@ -15,6 +15,18 @@
"name": "Launch Main",
"request": "launch",
"mainClass": "electrosphere.engine.Main",
"vmArgs": "-Xmx1G -Xms100m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=\"/tmp\"",
"projectName": "Renderer"
},
{
"type": "java",
"name": "Launch Main (Debug Audio)",
"request": "launch",
"mainClass": "electrosphere.engine.Main",
"env": {
"ALSOFT_LOGLEVEL": "4"
},
"vmArgs": "-Xmx1G -Xms100m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=\"/tmp\"",
"projectName": "Renderer"
},
{

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -85,6 +85,13 @@
"offsetZ" : 0
}
},
"ambientAudio": {
"responseWindAudioFilePath": "Audio/ambienceWind1SeamlessMono.ogg",
"responseWindLoops": true,
"randomizeOffset": true,
"gainMultiplier": 0.9,
"emitterSpatialOffset": [0,3,0]
},
"modelPath" : "Models/proceduralTree2/proceduralTree2v2.fbx"
}

View File

@@ -1,3 +1,3 @@
#maven.buildNumber.plugin properties file
#Wed Mar 06 19:14:23 EST 2024
buildNumber=31
#Sat Mar 09 15:31:50 EST 2024
buildNumber=34

View File

@@ -5,6 +5,7 @@
- @subpage fluidsimindex
- @subpage worldstorageindex
- @subpage uiarch
- @subpage audioengine
# What is this section

View File

@@ -1,3 +1,91 @@
@page audioengine Audio Engine
[TOC]
## High Level Overview
The main class to interact with directly is the VirtualAudioSourceManager.
It creates virtual audio sources, which are what game code should work with.
Under the hood, the VirtualAudioSourceManager dynamically maps virtual audio sources to real audio sources.
The real audio sources are what actually play audio to the user.
![](/docs/src/images/archaudio/virtualAudioSourceManagerArch.png)
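The mapping can be pictured with a small, self-contained sketch. Everything below (the class name, the method, and the idea of integer priorities where lower means closer to the listener) is invented for illustration and is not the engine's actual API:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch (not the engine's actual classes): shows how a
// VirtualAudioSourceManager-style mapper might pick which virtual sources
// get one of the limited real OpenAL sources each frame.
public class VirtualMappingSketch {

    // Lower priority value = closer to the listener = more important.
    public static List<Integer> assignRealSources(List<Integer> priorities, int realSlots) {
        List<Integer> sorted = new ArrayList<>(priorities);
        sorted.sort(Comparator.naturalOrder()); // mirrors a compareTo on priority
        return sorted.subList(0, Math.min(realSlots, sorted.size()));
    }

    public static void main(String[] args) {
        // Five virtual sources, two real slots: only the two nearest play.
        System.out.println(assignRealSources(List.of(40, 3, 12, 7, 90), 2)); // prints [3, 7]
    }
}
```

Every frame the real slots can be reassigned, so a virtual source that drifts away from the listener silently loses its real source to a closer one.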
## Major Usage Notes
- OpenAL will not play stereo audio spatially. Audio must be converted to mono before OpenAL will position it in 3D space.
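For example, a stereo ogg can be downmixed to mono with ffmpeg (assuming ffmpeg is available; filenames here are illustrative):

```shell
# -ac 1 forces a single audio channel so OpenAL will spatialize the result
ffmpeg -i ambienceWind1Seamless.ogg -ac 1 ambienceWind1SeamlessMono.ogg
```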
## Main Classes
[VirtualAudioSourceManager.java](@ref #electrosphere.audio.VirtualAudioSourceManager) - The manager of all virtual audio sources
[VirtualAudioSource.java](@ref #electrosphere.audio.VirtualAudioSource) - An audio source being tracked in data by the engine
[AudioEngine.java](@ref #electrosphere.audio.AudioEngine) - A manager class for direct openal calls from things like main loop and renderer
[AudioSource.java](@ref #electrosphere.audio.AudioSource) - A wrapper around an openAL audio source
[AudioListener.java](@ref #electrosphere.audio.AudioListener) - A wrapper around the openAL listener
[AudioBuffer.java](@ref #electrosphere.audio.AudioBuffer) - A wrapper around an openAL buffer
[AudioUtils.java](@ref #electrosphere.audio.AudioUtils) - Utility functions for creating audio sources (protected to just this package)
[ClientAmbientAudioTree.java](@ref #electrosphere.entity.state.ambientaudio.ClientAmbientAudioTree) - A client-side behavior tree for emitting audio ambiently from an entity
[AmbientAudio.java](@ref #electrosphere.game.data.foliage.type.AmbientAudio) - Ambient audio data about a given type of entity from the entity description in data
## Library Explanation
#### OpenAL Context Creation
OpenAL supports numerous extensions that add features like effects, HRTF, etc.
These extensions must be enabled manually when the OpenAL context is created.
This should happen in [AudioEngine.java](@ref #electrosphere.audio.AudioEngine) when it initializes OpenAL.
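As a rough illustration, the attribute list handed to `alcCreateContext` is a flat, zero-terminated array of key/value pairs. The sketch below hard-codes the `ALC_SOFT_HRTF` constant values so it compiles without LWJGL on the classpath; treat the exact values as assumptions to verify against OpenAL Soft's `alext.h`:

```java
// Hypothetical sketch of building the context attribute list that requests
// HRTF. Constants are hard-coded here (believed to match alext.h) so the
// example does not need LWJGL.
public class ContextAttrSketch {
    static final int ALC_HRTF_SOFT = 0x1992; // assumed value from ALC_SOFT_HRTF
    static final int ALC_TRUE = 1;

    // Attribute lists are flat {key, value, ..., 0} arrays, zero-terminated.
    public static int[] buildAttrs(boolean wantHrtf) {
        if (!wantHrtf) {
            return null; // null attrs = default context, no extensions requested
        }
        return new int[]{ALC_HRTF_SOFT, ALC_TRUE, 0};
    }

    public static void main(String[] args) {
        int[] attrs = buildAttrs(true);
        System.out.println(attrs.length); // 3
    }
}
```

In the real engine this corresponds to the `getContextAttrs` helper, which only allocates the buffer when at least one desired extension is present.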
## Code Organization and Best Practices
#### Startup
TODO
#### Usage
Creating a virtual audio source -- this is analogous to saying "Play this audio here"
```
Globals.assetManager.addAudioPathToQueue(ambientAudio.getResponseWindAudioFilePath());
VirtualAudioSource virtualAudioSource = Globals.virtualAudioSourceManager.createVirtualAudioSource(
ambientAudio.getResponseWindAudioFilePath(),
VirtualAudioSourceType.ENVIRONMENT_LONG,
ambientAudio.getResponseWindLoops(),
new Vector3d(0,0,0)
);
virtualAudioSource.setGain(ambientAudio.getGainMultiplier());
```
Creating a UI audio effect
```
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/openMenu.ogg", VirtualAudioSourceType.UI, false);
```
## Terminology
- Virtual Audio Source - An entity or empty source of audio in the game space that may or may not actually be emitting audio to the user.
- Real Audio Source - An OpenAL audio source that is definitely emitting audio to the user.
- Virtual Audio Source Manager - The manager of all virtual audio sources that handles creating, queueing, and destroying audio sources.
- HRTF - Head-related transfer function, math that makes audio sound more realistic by modeling how it enters your ear
## Known Bugs To Fix
- ClientAmbientAudioTree does not destroy itself when non-looping audio finishes playing
- AudioBuffer can hard crash if no file is found
## Future Goals

View File

@@ -1,3 +1,5 @@
@page creaturemechanicsideas Mechanics Ideas
If/when fluidsim is a part of the engine, have a creature that constantly keeps a bubble of water around itself (drawing from nearby sources as it approaches them). When it runs up to regular creatures to try to attack them, it will force them into swimming state or outright kill them.
Scary creatures that, when entering a chunk, gradually reduce environment volume to zero (ie crickets stop chirping because it's so scary). Like a demon scarecrow or devil

Binary file not shown.

Binary image added (Size: 33 KiB)

View File

@@ -128,6 +128,11 @@ Implement proper Frustum Culling
(03/06/2024)
Bake in imgui
(03/07/2024)
Server frametime bar graph
@@ -140,14 +145,19 @@ Bake in imgui
# TODO
Ability to attach ambient audio emitters to entities
Server frametime bar graph
Environment background noise manager - does what it says on the tin
Timekeeping class that defaults to gltf time and falls back to systemCurrentTimeMillis
Methods for sleeping physics bodies if nothing nearby them is dynamic (ie trees if there are no moving creatures near them)
- SAP2 space from ode4j specifically
De-dupe render calls by tracking mutations in render pipeline state and not setting variables to values they already hold
Overhaul mesh class
- remove unused stuff
- private constructor
(this is going to require changing a lot of dependencies)
Build a lod system
- Could potentially be held at actor level
@@ -167,11 +177,6 @@ Ray Traced Audio Engine
Documentation Pass on as many files as you can stomach
Overhaul mesh class
- remove unused stuff
- private constructor
(this is going to require changing a lot of dependencies)
Generate Tree Entities
- Generate stem
- Ability to specify central stem

View File

@@ -3,114 +3,135 @@ package electrosphere.audio;
import electrosphere.logger.LoggerInterface;
import electrosphere.util.FileUtils;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import java.nio.ShortBuffer;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.lwjgl.BufferUtils;
import static org.lwjgl.BufferUtils.createByteBuffer;
import static org.lwjgl.openal.AL10.*;
import static org.lwjgl.stb.STBVorbis.*;
import org.lwjgl.stb.STBVorbis;
import org.lwjgl.stb.STBVorbisInfo;
import org.lwjgl.system.MemoryStack;
import org.lwjgl.system.MemoryUtil;
import static org.lwjgl.system.MemoryUtil.NULL;
/**
* An audio buffer
*/
public class AudioBuffer {
//the id of the buffer
private int bufferId;
//buffer containing vorbis metadata
private ByteBuffer vorbis = null;
//The main audio data
private ShortBuffer pcm = null;
public AudioBuffer(String fileName) {
//the length of the audio source in milliseconds
float length = 0;
/**
* Creates the audio buffer object
* @param fileNameRaw The path for the audio file
*/
public AudioBuffer(String fileNameRaw) {
String fileNameSanitized = FileUtils.sanitizeFilePath(fileNameRaw);
bufferId = alGenBuffers();
//create buffer to store vorbis data
try (STBVorbisInfo info = STBVorbisInfo.malloc()) {
ShortBuffer pcm = readVorbis(fileName, 32 * 1024, info);
//read the vorbis data as well as main audio data
ShortBuffer pcm = readVorbis(fileNameSanitized, 32 * 1024, info);
// Copy to buffer
alBufferData(bufferId, info.channels() == 1 ? AL_FORMAT_MONO16 : AL_FORMAT_STEREO16, pcm, info.sample_rate());
} catch (Exception e){
LoggerInterface.loggerEngine.ERROR("Failed to load audio", e);
// e.printStackTrace();
LoggerInterface.loggerAudio.ERROR("Failed to load audio", e);
}
LoggerInterface.loggerAudio.DEBUG("Created audio buffer(" + fileNameRaw + ") with length " + length);
}
private ShortBuffer readVorbis(String resource, int bufferSize, STBVorbisInfo info) throws Exception {
/**
* Reads vorbis data. Constructs a buffer that contains vorbis metadata as well as reading the main audio data into a buffer that is returned.
* @param filepath The filepath to read
* @param bufferSize The size of the buffer
* @param info The vorbis info buffer to be filled
* @return The main audio data buffer
* @throws Exception Throws an exception if the decoder fails
*/
private ShortBuffer readVorbis(String filepath, int bufferSize, STBVorbisInfo info) throws Exception {
try (MemoryStack stack = MemoryStack.stackPush()) {
vorbis = AudioBuffer.ioResourceToByteBuffer(resource, bufferSize);
//read the vorbis data from disk
vorbis = AudioBuffer.readFilepathToByteBuffer(filepath);
//decode the vorbis data
IntBuffer error = stack.mallocInt(1);
long decoder = stb_vorbis_open_memory(vorbis, error, null);
long decoder = STBVorbis.stb_vorbis_open_memory(vorbis, error, null);
if (decoder == NULL) {
throw new RuntimeException("Failed to open Ogg Vorbis file. Error: " + error.get(0));
}
stb_vorbis_get_info(decoder, info);
//creates the vorbis metadata object and grabs information about the audio file
STBVorbis.stb_vorbis_get_info(decoder, info);
int channels = info.channels();
int lengthSamples = STBVorbis.stb_vorbis_stream_length_in_samples(decoder);
this.length = STBVorbis.stb_vorbis_stream_length_in_seconds(decoder) * 10;
int lengthSamples = stb_vorbis_stream_length_in_samples(decoder);
//reads the main audio data
pcm = MemoryUtil.memAllocShort(lengthSamples);
pcm.limit(STBVorbis.stb_vorbis_get_samples_short_interleaved(decoder, channels, pcm) * channels);
pcm.limit(stb_vorbis_get_samples_short_interleaved(decoder, channels, pcm) * channels);
stb_vorbis_close(decoder);
//close decoder and return
STBVorbis.stb_vorbis_close(decoder);
return pcm;
}
}
public static ByteBuffer ioResourceToByteBuffer(String resource, int bufferSize) throws IOException {
ByteBuffer buffer;
/**
* Reads the filepath into a byte buffer
* @param filepath The filepath
* @return The byte buffer if the file was read successfully, null otherwise
* @throws IOException Thrown if the file was not readable
*/
private static ByteBuffer readFilepathToByteBuffer(String filepath) throws IOException {
ByteBuffer buffer = null;
Path path = Paths.get(resource);
if (Files.isReadable(path)) {
try (SeekableByteChannel fc = Files.newByteChannel(path)) {
Path path = FileUtils.getAssetFile(filepath).toPath();
if(Files.isReadable(path)){
try(SeekableByteChannel fc = Files.newByteChannel(path)){
buffer = BufferUtils.createByteBuffer((int) fc.size() + 1);
while (fc.read(buffer) != -1) ;
while(fc.read(buffer) != -1){
}
buffer.flip();
}
} else {
try (
InputStream source = FileUtils.getAssetFileAsStream(resource);
ReadableByteChannel rbc = Channels.newChannel(source)
) {
buffer = createByteBuffer(bufferSize);
while (true) {
int bytes = rbc.read(buffer);
if (bytes == -1) {
break;
}
if (buffer.remaining() == 0) {
int capacity = buffer.capacity();
ByteBuffer newBuffer = createByteBuffer(capacity * 2);
for(int i = 0; i < capacity; i++){
newBuffer.put(buffer.get());
}
buffer = newBuffer;
}
}
}
LoggerInterface.loggerFileIO.ERROR("Failed to create audio, file is not readable: " + filepath, new IOException("File access error!"));
}
buffer.flip();
return buffer;
}
/**
* Gets the id of this buffer
* @return The id
*/
public int getBufferId() {
return this.bufferId;
return bufferId;
}
/**
* Cleans up this audio buffer
*/
public void cleanup() {
alDeleteBuffers(this.bufferId);
alDeleteBuffers(bufferId);
}
}

View File

@@ -1,41 +1,61 @@
package electrosphere.audio;
import electrosphere.engine.Globals;
import electrosphere.entity.types.camera.CameraEntityUtils;
import electrosphere.logger.LoggerInterface;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.joml.Vector3f;
import org.lwjgl.BufferUtils;
import org.lwjgl.openal.AL;
import org.lwjgl.openal.ALC;
//import org.lwjgl.openal.*;
import org.lwjgl.openal.ALC10;
import org.lwjgl.openal.ALC11;
import org.lwjgl.openal.ALCCapabilities;
import org.lwjgl.openal.SOFTHRTF;
import org.lwjgl.system.MemoryUtil;
import static org.lwjgl.openal.ALC10.alcDestroyContext;
import static org.lwjgl.openal.ALC10.alcCloseDevice;
import static org.lwjgl.system.MemoryUtil.NULL;
/**
* Main class that handles audio processing
*/
public class AudioEngine {
//openal device
private long device;
//openal context
private long context;
//the listener data for the audio landscape
private AudioListener listener;
private final List<AudioBuffer> soundBufferList;
private final Map<String, AudioSource> soundSourceMap;
//the current gain level of the engine
private float engineGain = 1.0f;
//The current device
String currentDevice = "";
//the default device
String defaultDevice = "";
//if true, hrtf present and active
boolean hasHRTF = false;
//if true, efx present and active
boolean hasEFX = false;
/**
* Creates an audio engine
*/
public AudioEngine() {
soundBufferList = new ArrayList<AudioBuffer>();
soundSourceMap = new HashMap<String,AudioSource>();
}
/**
* Initializes the audio engine
*/
public void init() {
try {
initDevice();
@@ -45,33 +65,153 @@ public class AudioEngine {
listener = new AudioListener();
}
void initDevice() throws Exception{
/**
* Lists all available devices
*/
public void listAllDevices(){
currentDevice = ALC11.alcGetString(NULL,ALC11.ALC_ALL_DEVICES_SPECIFIER);
LoggerInterface.loggerAudio.INFO("AL device: " + currentDevice);
defaultDevice = ALC11.alcGetString(NULL,ALC11.ALC_DEFAULT_ALL_DEVICES_SPECIFIER);
LoggerInterface.loggerAudio.INFO("AL default device: " + defaultDevice);
}
/**
* Initializes audio devices
* @throws Exception Thrown if there are no audio devices or fails to create openal context
*/
void initDevice() throws Exception {
//create device
LoggerInterface.loggerAudio.DEBUG("Open ALC device");
this.device = ALC10.alcOpenDevice((ByteBuffer) null);
if (device == NULL) {
throw new IllegalStateException("Failed to open the default OpenAL device.");
}
//create capabilities
LoggerInterface.loggerAudio.DEBUG("Create device capabilities");
ALCCapabilities deviceCaps = ALC.createCapabilities(device);
this.context = ALC10.alcCreateContext(device, (IntBuffer) null);
//create context
LoggerInterface.loggerAudio.DEBUG("Create context");
IntBuffer attrBuffer = getContextAttrs(deviceCaps);
this.context = ALC10.alcCreateContext(device, attrBuffer);
MemoryUtil.memFree(attrBuffer);
if (context == NULL) {
throw new IllegalStateException("Failed to create OpenAL context.");
}
LoggerInterface.loggerAudio.DEBUG("Make Context Current");
ALC10.alcMakeContextCurrent(context);
AL.createCapabilities(deviceCaps);
}
public void setGain(float gain){
engineGain = gain;
/**
* Gets the attrs buffer for creating a context
* @param deviceCaps The device capabilities
* @return The buffer (may be null if no desired extensions present)
*/
IntBuffer getContextAttrs(ALCCapabilities deviceCaps){
int bufferSize = 0;
//check for available extensions
if(deviceCaps.ALC_EXT_EFX){
LoggerInterface.loggerAudio.INFO("EFX PRESENT");
hasEFX = true;
} else {
LoggerInterface.loggerAudio.INFO("EFX NOT PRESENT");
}
if(deviceCaps.ALC_SOFT_HRTF){
LoggerInterface.loggerAudio.INFO("SOFT HRTF PRESENT");
hasHRTF = true;
bufferSize++;
} else {
LoggerInterface.loggerAudio.INFO("SOFT HRTF NOT PRESENT");
}
IntBuffer rVal = null;
//construct buffer if any were found
if(bufferSize > 0 ){
rVal = BufferUtils.createIntBuffer(bufferSize * 2 + 1);
if(deviceCaps.ALC_SOFT_HRTF){
rVal.put(SOFTHRTF.ALC_HRTF_SOFT);
rVal.put(ALC11.ALC_TRUE);
}
rVal.put(0);
rVal.flip();
}
LoggerInterface.loggerAudio.INFO("Create attributes with size: " + bufferSize);
return rVal;
}
public float getGain(){
return engineGain;
/**
* Updates the orientation of the listener based on the global player camera
*/
private void updateListener(){
Vector3f cameraPos = CameraEntityUtils.getCameraCenter(Globals.playerCamera);
Vector3f cameraEye = new Vector3f(CameraEntityUtils.getCameraEye(Globals.playerCamera)).mul(-1);
Vector3f cameraUp = new Vector3f(0,1,0);
listener.setPosition(cameraPos);
listener.setOrientation(cameraEye, cameraUp);
}
/**
* Updates the audio engine
*/
public void update(){
updateListener();
}
/**
* Shuts down the engine
*/
public void shutdown(){
alcDestroyContext(context);
alcCloseDevice(device);
}
/**
* Sets the gain of the engine
* @param gain The gain value
*/
public void setGain(float gain){
engineGain = gain;
}
/**
* Gets the gain of the engine
* @return The gain value
*/
public float getGain(){
return engineGain;
}
/**
* Gets the current openal device
* @return The current openal device
*/
public String getDevice(){
return currentDevice;
}
/**
* Gets the default openal device
* @return the default openal device
*/
public String getDefaultDevice(){
return defaultDevice;
}
/**
* Gets the HRTF status
* @return The HRTF status
*/
public boolean getHRTFStatus(){
return hasHRTF;
}
/**
* Gets the listener for the audio engine
* @return the listener
*/
public AudioListener getListener(){
return listener;
}
}

View File

@@ -1,36 +1,66 @@
package electrosphere.audio;
import org.joml.Vector3d;
import org.joml.Vector3f;
import static org.lwjgl.openal.AL10.*;
/**
*
* @author amaterasu
* Encapsulates the listening position in the audio engine
*/
public class AudioListener {
Vector3f position;
//The position of the listener
Vector3d position;
public AudioListener() {
this(new Vector3f(0, 0, 0));
//eye vector for listener
Vector3f eye = new Vector3f(1,0,0);
//up vector for listener
Vector3f up = new Vector3f(0,1,0);
/**
* Constructor
*/
protected AudioListener() {
this(new Vector3d(0, 0, 0));
}
public AudioListener(Vector3f position) {
/**
* Constructor
* @param position The position of the listener
*/
protected AudioListener(Vector3d position) {
this.position = position;
alListener3f(AL_POSITION, this.position.x, this.position.y, this.position.z);
alListener3f(AL_POSITION, (float)this.position.x, (float)this.position.y, (float)this.position.z);
alListener3f(AL_VELOCITY, 0, 0, 0);
}
public void setSpeed(Vector3f speed) {
/**
* Sets the speed of the listener
* @param speed the speed
*/
protected void setSpeed(Vector3f speed) {
alListener3f(AL_VELOCITY, speed.x, speed.y, speed.z);
}
public void setPosition(Vector3f position) {
/**
* Sets the position of the listener
* @param position the position
*/
protected void setPosition(Vector3f position) {
this.position.set(position.x, position.y, position.z);
alListener3f(AL_POSITION, position.x, position.y, position.z);
}
public void setOrientation(Vector3f at, Vector3f up) {
/**
* Sets the orientation of the listener
* @param at the forward vector of the camera
* @param up The up vector of the camera
*/
protected void setOrientation(Vector3f at, Vector3f up) {
this.eye.set(at);
this.up.set(up);
float[] data = new float[6];
data[0] = at.x;
data[1] = at.y;
@@ -41,5 +71,29 @@ public class AudioListener {
alListenerfv(AL_ORIENTATION, data);
}
/**
* Gets the position of the listener
* @return The position
*/
public Vector3d getPosition(){
return position;
}
/**
* Gets the eye vector
* @return the eye vector
*/
public Vector3f getEyeVector(){
return eye;
}
/**
* Gets the up vector
* @return the up vector
*/
public Vector3f getUpVector(){
return up;
}
}

View File

@@ -1,67 +1,120 @@
package electrosphere.audio;
import org.joml.Vector3f;
import static org.lwjgl.openal.AL10.*;
import org.lwjgl.openal.AL11;
import electrosphere.logger.LoggerInterface;
import org.lwjgl.openal.AL10;
/**
*
* @author amaterasu
* A source of audio in the aural space
*/
public class AudioSource {
//The id for the source
int sourceId;
public AudioSource(boolean loop, boolean relative){
this.sourceId = alGenSources();
/**
* Creates an audio source object
* @param loop if true, will loop audio, otherwise will not
* @param relative if true, the source's position is interpreted relative to the listener
*/
protected AudioSource(boolean loop, boolean relative){
this.sourceId = AL10.alGenSources();
if (loop) {
alSourcei(sourceId, AL_LOOPING, AL_TRUE);
AL10.alSourcei(sourceId, AL10.AL_LOOPING, AL10.AL_TRUE);
}
if (relative) {
alSourcei(sourceId, AL_SOURCE_RELATIVE, AL_TRUE);
AL10.alSourcei(sourceId, AL10.AL_SOURCE_RELATIVE, AL10.AL_TRUE);
}
}
/**
* Sets the buffer that this source pulls from
* @param bufferId The id of the buffer
*/
public void setBuffer(int bufferId) {
stop();
alSourcei(sourceId, AL_BUFFER, bufferId);
AL10.alSourcei(sourceId, AL10.AL_BUFFER, bufferId);
}
/**
* Sets the position of the audio source
* @param position the position
*/
public void setPosition(Vector3f position) {
alSource3f(sourceId, AL_POSITION, position.x, position.y, position.z);
AL10.alSource3f(sourceId, AL10.AL_POSITION, position.x, position.y, position.z);
}
/**
* Sets the speed of the audio source
* @param speed the speed
*/
public void setSpeed(Vector3f speed) {
alSource3f(sourceId, AL_VELOCITY, speed.x, speed.y, speed.z);
AL10.alSource3f(sourceId, AL10.AL_VELOCITY, speed.x, speed.y, speed.z);
}
/**
* Sets the temporal offset of the source (ie how far into the clip to start playing)
* @param time The time in seconds
*/
public void setOffset(float time){
AL10.alSourcef(sourceId, AL11.AL_SEC_OFFSET, time);
}
/**
* Sets the gain of the audio source
* @param gain the gain
*/
public void setGain(float gain) {
alSourcef(sourceId, AL_GAIN, gain);
LoggerInterface.loggerAudio.DEBUG("Set Gain: " + gain);
AL10.alSourcef(sourceId, AL10.AL_GAIN, gain);
}
/**
* Sets an arbitrary property on the audio source
* @param param The param flag
* @param value The value to set the param to
*/
public void setProperty(int param, float value) {
alSourcef(sourceId, param, value);
AL10.alSourcef(sourceId, param, value);
}
/**
* Plays the audio source
*/
public void play() {
alSourcePlay(sourceId);
AL10.alSourcePlay(sourceId);
}
/**
* Gets whether the audio source is currently playing or not
* @return True if it is playing, false otherwise
*/
public boolean isPlaying() {
return alGetSourcei(sourceId, AL_SOURCE_STATE) == AL_PLAYING;
return AL10.alGetSourcei(sourceId, AL10.AL_SOURCE_STATE) == AL10.AL_PLAYING;
}
/**
* Pauses the audio source
*/
public void pause() {
alSourcePause(sourceId);
AL10.alSourcePause(sourceId);
}
/**
* Stops the audio source
*/
public void stop() {
alSourceStop(sourceId);
AL10.alSourceStop(sourceId);
}
/**
* Cleans up the source
*/
public void cleanup() {
stop();
alDeleteSources(sourceId);
AL10.alDeleteSources(sourceId);
}
}

View File

@@ -7,17 +7,22 @@ import electrosphere.util.FileUtils;
import org.joml.Vector3f;
/**
*
* @author amaterasu
* Utility functions for playing audio
*/
public class AudioUtils {
public static AudioSource playAudioAtLocation(String audioFile, Vector3f position){
/**
* Plays the audio file at the given path at the given position
* @param audioFile The audio file's path
* @param position The position to play it at
* @param loops If true, loops the source
* @return The audio source
*/
protected static AudioSource playAudioAtLocation(String audioFile, Vector3f position, boolean loops){
AudioSource rVal = null;
AudioBuffer buffer = Globals.assetManager.fetchAudio(audioFile);
if(buffer != null){
rVal = new AudioSource(false,false);
rVal = new AudioSource(loops,false);
rVal.setBuffer(buffer.getBufferId());
rVal.setGain(Globals.audioEngine.getGain());
rVal.setPosition(position);
@@ -28,14 +33,19 @@ public class AudioUtils {
return rVal;
}
public static AudioSource playAudio(String audioFile){
/**
* Plays an audio file
* @param audioFile The audio file path
* @param loops If true, loops the source
* @return The audio source
*/
protected static AudioSource playAudio(String audioFile, boolean loops){
AudioSource rVal = null;
AudioBuffer buffer = Globals.assetManager.fetchAudio(FileUtils.sanitizeFilePath(audioFile));
if(buffer != null){
rVal = new AudioSource(false,false);
rVal = new AudioSource(loops,false);
rVal.setBuffer(buffer.getBufferId());
rVal.setGain(Globals.audioEngine.getGain());
rVal.setPosition(new Vector3f());
rVal.play();
} else {
LoggerInterface.loggerEngine.WARNING("Failed to start audio in playAudio");

View File

@@ -0,0 +1,222 @@
package electrosphere.audio;
import org.joml.Vector3d;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.engine.Globals;
import electrosphere.logger.LoggerInterface;
/**
* Represents an audio emitter that is being tracked by the game engine. Does not map 1-to-1 with audio that is actually played.
* This allows the engine to have 500 "audio sources" while only having a select few actually emit audio.
*/
public class VirtualAudioSource implements Comparable<VirtualAudioSource> {
//The priority of this virtual audio source
int priority = 100;
//the gain to play this audio source at
float gain = 1.0f;
//the path of the audio source
String filePath;
//whether the audio source loops or not
boolean loops = false;
//the position of the audio source
Vector3d position = null;
//the total time this audio source has played
float totalTimePlayed = 0;
//the rate to raise lower gain each frame
float fadeRate = 0;
//the modifier applied to gain based on fade rate
float fadeModifier = 1.0f;
//The type of virtual audio source
VirtualAudioSourceType type;
/**
* Creates an absolute (non-relative-to-listener) audio source
* @param filePath The filepath of the audio source
* @param type The type of virtual audio source
* @param loops if true, loop the audio source
*/
protected VirtualAudioSource(String filePath, VirtualAudioSourceType type, boolean loops){
this.filePath = filePath;
this.type = type;
this.loops = loops;
}
/**
* Creates a relative (spatial) audio source
* @param filePath The filepath of the audio source
* @param type The type of virtual audio source
* @param loops if true, loop the audio source
* @param position The position to play the audio source at
*/
protected VirtualAudioSource(String filePath, VirtualAudioSourceType type, boolean loops, Vector3d position){
this.filePath = filePath;
this.type = type;
this.loops = loops;
this.position = position;
}
/**
* Updates the audio source's current state
* @param deltaTime the time that has elapsed since the last frame
* @return true if the source is still playing, false otherwise
*/
protected boolean update(float deltaTime){
boolean isStillPlaying = true;
this.totalTimePlayed += deltaTime;
AudioBuffer buffer = Globals.assetManager.fetchAudio(filePath);
// LoggerInterface.loggerAudio.DEBUG("Increment virtual audio source " + deltaTime);
if(buffer != null){
if(this.totalTimePlayed >= buffer.length){
if(loops){
this.totalTimePlayed = this.totalTimePlayed % buffer.length;
} else {
isStillPlaying = false;
LoggerInterface.loggerAudio.DEBUG("Virtual Audio Source Timeout " + totalTimePlayed + " > " + buffer.length);
}
}
}
if(gain <= 0){
LoggerInterface.loggerAudio.DEBUG("Virtual Audio Source Gainout " + gain);
isStillPlaying = false;
}
//gradually fade sources
if(fadeRate == 0){
} else if(fadeRate < 0){
this.fadeModifier += fadeRate;
if(this.fadeModifier < 0){
isStillPlaying = false;
fadeRate = 0;
this.fadeModifier = 1.0f;
}
} else if(fadeRate == 1){
this.fadeModifier = 1.0f;
this.fadeRate = 0;
} else if(fadeRate > 0){
this.fadeModifier += fadeRate;
if(this.fadeModifier > 1){
this.fadeModifier = 1;
this.fadeRate = 0;
}
}
return isStillPlaying;
}
/**
* Sets the priority of this virtual audio source
* @param priority The priority
*/
public void setPriority(int priority){
this.priority = priority;
}
/**
* Sets the gain of this audio source
* @param gain The gain
*/
public void setGain(float gain){
this.gain = gain;
}
/**
* Sets the position of this virtual source
* @param position The position of this source
*/
public void setPosition(Vector3d position){
this.position.set(position);
}
/**
* Sets the total time that this virtual audio source has played
* @param totalTimePlayed The time
*/
public void setTotalTimePlayed(float totalTimePlayed){
this.totalTimePlayed = totalTimePlayed;
}
/**
* Gets whether this virtual audio source is relative to the speaker or should be played non-spatially
* @return True if relative to speaker, false otherwise
*/
public boolean isRelative(){
return this.position != null;
}
/**
* Sets the source to start to fade out
* @param fadeRate The rate to lower gain each frame
*/
public void setFadeRate(float fadeRate){
this.fadeRate = fadeRate;
}
/**
* Gets the position of this source
* @return the position
*/
public Vector3d getPosition(){
return position;
}
/**
* Gets the priority of this source
* @return The priority
*/
public int getPriority(){
return priority;
}
/**
* Gets the type of this source
* @return The type
*/
public VirtualAudioSourceType getType(){
return type;
}
/**
* Gets the gain of the source
* @return The gain
*/
public float getGain(){
return gain * fadeModifier;
}
/**
* Gets the total time this virtual audio source has played for
* @return The total time
*/
public float getTotalTimePlayed(){
return totalTimePlayed;
}
/**
* Gets the length of the buffer this virtual audio source relates to
* @return The buffer length
*/
public float getBufferLength(){
AudioBuffer buffer = Globals.assetManager.fetchAudio(filePath);
if(buffer != null){
return buffer.length;
}
return 0;
}
@Override
public int compareTo(VirtualAudioSource o) {
return this.priority - o.priority;
}
}

View File

@@ -0,0 +1,219 @@
package electrosphere.audio;
import java.util.Comparator;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import org.joml.Vector3d;
import org.joml.Vector3f;
import electrosphere.engine.Globals;
import electrosphere.entity.types.camera.CameraEntityUtils;
import electrosphere.logger.LoggerInterface;
/**
* Manages all the virtual audio sources in the engine.
* Divides all virtual sources into buckets of different types: ui, environment, creatures, etc.
* Then sorts them by priority and phases them in or out based on priority.
*/
public class VirtualAudioSourceManager {
/**
* Types of virtual audio sources
*/
public enum VirtualAudioSourceType {
UI,
ENVIRONMENT_SHORT,
ENVIRONMENT_LONG,
CREATURE,
}
//the list of categories of virtual audio sources
List<VirtualAudioSourceCategory> categories = new LinkedList<VirtualAudioSourceCategory>();
//The list of all virtual sources
List<VirtualAudioSource> virtualSourceQueue = new LinkedList<VirtualAudioSource>();
//the map of virtual source to active source for all active sources
Map<VirtualAudioSource,AudioSource> virtualActiveMap = new HashMap<VirtualAudioSource,AudioSource>();
//Temporary list used to store sources that need to be destroyed
List<VirtualAudioSource> sourcesToKill = new LinkedList<VirtualAudioSource>();
/**
* Creates the manager
*/
public VirtualAudioSourceManager(){
//add all categories
categories.add(new VirtualAudioSourceCategory(VirtualAudioSourceType.UI,4,-0.1f,1.0f));
categories.add(new VirtualAudioSourceCategory(VirtualAudioSourceType.ENVIRONMENT_SHORT,6,-0.1f,1.0f));
categories.add(new VirtualAudioSourceCategory(VirtualAudioSourceType.ENVIRONMENT_LONG,8,-0.05f,0.05f));
categories.add(new VirtualAudioSourceCategory(VirtualAudioSourceType.CREATURE,8,-0.1f,1.0f));
}
/**
* Creates a spatial virtual audio source
* @param filePath The file path for the audio source
* @param type The type of audio source (ui, environment, etc)
* @param loops If true, loops the audio source
* @param position The position of the audio source
* @return The newly created virtual audio source
*/
public VirtualAudioSource createVirtualAudioSource(String filePath, VirtualAudioSourceType type, boolean loops, Vector3d position){
VirtualAudioSource source = new VirtualAudioSource(filePath, type, loops, position);
LoggerInterface.loggerAudio.DEBUG("Create virtual audio source " + filePath);
this.virtualSourceQueue.add(source);
return source;
}
/**
* Creates a non-spatial virtual audio source
* @param filePath The file path for the audio source
* @param type The type of audio source (ui, environment, etc)
* @param loops If true, loops the audio source
* @return The newly created virtual audio source
*/
public VirtualAudioSource createVirtualAudioSource(String filePath, VirtualAudioSourceType type, boolean loops){
VirtualAudioSource source = new VirtualAudioSource(filePath, type, loops);
LoggerInterface.loggerAudio.DEBUG("Create virtual audio source " + filePath);
this.virtualSourceQueue.add(source);
return source;
}
/**
* Updates all virtual audio sources this frame
* @param deltaTime The time elapsed since the previous frame
*/
public void update(float deltaTime){
//update priority of all virtual audio sources based on distance from camera position
if(Globals.playerCamera!=null){
Vector3d cameraEarPos = new Vector3d(CameraEntityUtils.getCameraCenter(Globals.playerCamera)).add(CameraEntityUtils.getCameraEye(Globals.playerCamera));
for(VirtualAudioSource source : virtualSourceQueue){
if(source.position!=null){
source.setPriority((int)cameraEarPos.distance(source.position));
}
}
}
//go through each audio source and destroy ones that are no longer virtually playing
sourcesToKill.clear();
for(VirtualAudioSource source : virtualSourceQueue){
boolean stillActive = source.update(deltaTime);
if(!stillActive){
LoggerInterface.loggerAudio.DEBUG("Kill Virtual Audio Source");
sourcesToKill.add(source);
}
}
for(VirtualAudioSource source : sourcesToKill){
AudioSource realSource = virtualActiveMap.remove(source);
if(realSource != null){
realSource.stop();
}
virtualSourceQueue.remove(source);
for(VirtualAudioSourceCategory category : categories){
category.activeVirtualSources.remove(source);
}
}
//sort audio sources
virtualSourceQueue.sort(Comparator.naturalOrder());
LoggerInterface.loggerAudio.DEBUG("Virtual audio source count: " + virtualSourceQueue.size());
//for each bucket that has capacity, start available sources
for(VirtualAudioSourceCategory category : categories){
LoggerInterface.loggerAudio.DEBUG("Audio category: " + category.type + " Active Virtual Sources: " + category.activeVirtualSources.size());
//
for(VirtualAudioSource source : virtualSourceQueue){
if(source.type != category.type){
continue;
}
//if it is an active source, set its gain to the virtual source's gain
if(virtualActiveMap.containsKey(source)){
AudioSource realSource = virtualActiveMap.get(source);
realSource.setGain(source.getGain());
}
//if there is a currently active source in this category that is lower priority than this source
//tell the engine to start fading out that lower priority source
if(!category.activeVirtualSources.contains(source)){
for(VirtualAudioSource activeSource : category.activeVirtualSources){
if(activeSource.priority > source.priority){
activeSource.setFadeRate(category.fadeOutRate);
break;
}
}
}
//add virtual source if necessary
if(category.activeVirtualSources.size() < category.capacity && !category.activeVirtualSources.contains(source)){
//activate source here
category.activeVirtualSources.add(source);
AudioSource realSource = null;
LoggerInterface.loggerAudio.DEBUG("MAP Audio to real source! ");
if(source.position == null){
realSource = AudioUtils.playAudio(source.filePath,source.loops);
} else {
realSource = AudioUtils.playAudioAtLocation(source.filePath, new Vector3f((float)source.position.x,(float)source.position.y,(float)source.position.z),source.loops);
}
source.setFadeRate(category.fadeInRate);
realSource.setGain(source.gain);
realSource.setOffset(source.totalTimePlayed);
virtualActiveMap.put(source, realSource);
}
}
}
}
/**
* Gets the queue of all virtual audio sources (active or not)
* @return The queue
*/
public List<VirtualAudioSource> getSourceQueue(){
return virtualSourceQueue;
}
//scratch list reused by getMappedSources() to avoid per-call allocation
List<VirtualAudioSource> virtualSourcesMappedToRealSources = new LinkedList<VirtualAudioSource>();
/**
* Gets the list of all virtual sources currently mapped to real sources
* @return The list
*/
public List<VirtualAudioSource> getMappedSources(){
virtualSourcesMappedToRealSources.clear();
for(VirtualAudioSourceCategory category : categories){
for(VirtualAudioSource activeSource : category.activeVirtualSources){
virtualSourcesMappedToRealSources.add(activeSource);
}
}
virtualSourcesMappedToRealSources.sort(Comparator.naturalOrder());
return virtualSourcesMappedToRealSources;
}
/**
* A category of virtual audio sources
*/
private static class VirtualAudioSourceCategory {
//the type of the category
VirtualAudioSourceType type;
//the number of real audio sources that can be playing for this category at a given time
int capacity;
//the rate to fade out sources in this category by when they become lower priority
float fadeOutRate;
//the rate to fade in sources in this category by when they become higher priority
float fadeInRate = 0;
//the list of virtual audio sources that are currently being played for this category
List<VirtualAudioSource> activeVirtualSources = new LinkedList<VirtualAudioSource>();
/**
* Constructor
* @param type The category type
* @param capacity The maximum number of real sources for this category
* @param fadeOutRate The fade rate applied to deprioritized sources
* @param fadeInRate The fade rate applied to newly activated sources
*/
public VirtualAudioSourceCategory(VirtualAudioSourceType type, int capacity, float fadeOutRate, float fadeInRate){
this.type = type;
this.capacity = capacity;
this.fadeOutRate = fadeOutRate;
this.fadeInRate = fadeInRate;
}
}
}
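The per-category capacity check in `update` is what keeps the number of real OpenAL sources bounded. A standalone sketch of that bucketing, with illustrative names (`Virtual`, `mapToReal`) that are not the engine's API: given the sorted queue, only the first `capacity` sources of a category get mapped to real sources.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/**
 * Hypothetical sketch of the capacity bucketing in VirtualAudioSourceManager
 * (names and types here are illustrative, not the engine API).
 */
public class CapacitySketch {

    static class Virtual {
        final String name;
        final String category;
        final int priority;
        Virtual(String name, String category, int priority) {
            this.name = name;
            this.category = category;
            this.priority = priority;
        }
    }

    /** Returns the names of the sources that would be mapped to real audio sources. */
    static List<String> mapToReal(List<Virtual> queue, String category, int capacity) {
        //sort ascending by priority, mirroring virtualSourceQueue.sort(Comparator.naturalOrder())
        queue.sort(Comparator.comparingInt(v -> v.priority));
        List<String> mapped = new ArrayList<>();
        for (Virtual v : queue) {
            if (!v.category.equals(category)) {
                continue;
            }
            //only the first `capacity` sources in this category get a real source
            if (mapped.size() < capacity) {
                mapped.add(v.name);
            }
        }
        return mapped;
    }

    public static void main(String[] args) {
        List<Virtual> queue = new ArrayList<>();
        queue.add(new Virtual("windFar", "ENVIRONMENT_LONG", 80));
        queue.add(new Virtual("windNear", "ENVIRONMENT_LONG", 4));
        queue.add(new Virtual("windMid", "ENVIRONMENT_LONG", 30));
        //with capacity 2, the farthest source stays virtual-only
        System.out.println(mapToReal(queue, "ENVIRONMENT_LONG", 2)); //[windNear, windMid]
    }
}
```

The real manager additionally fades out a lower-priority active source (via `setFadeRate`) before a higher-priority one takes its slot, rather than cutting it instantly.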


@ -56,6 +56,11 @@ public class ClientSimulation {
HitboxUtils.clientCollideEntities(currentHitbox);
}
}
//update audio engine
if(Globals.audioEngine!=null){
Globals.audioEngine.update();
Globals.virtualAudioSourceManager.update(Main.deltaFrames);
}
//update foliage
Globals.clientFoliageManager.update();
//tally collidables and offset position accordingly


@ -72,6 +72,7 @@ import org.joml.Vector3f;
import org.lwjgl.glfw.GLFW;
import electrosphere.audio.AudioUtils;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.client.targeting.crosshair.Crosshair;
import electrosphere.client.terrain.editing.TerrainEditing;
import electrosphere.collision.CollisionEngine;
@ -880,7 +881,7 @@ public class ControlHandler {
Globals.controlHandler.setHandlerState(ControlsState.IN_GAME_MAIN_MENU);
Globals.controlHandler.showMouse();
//play sound effect
AudioUtils.playAudio("/Audio/openMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/openMenu.ogg", VirtualAudioSourceType.UI, false);
}});
controls.get(DATA_STRING_INPUT_CODE_IN_GAME_MAIN_MENU).setRepeatTimeout(0.5f * Main.targetFrameRate);
@ -902,7 +903,7 @@ public class ControlHandler {
Globals.controlHandler.setHandlerState(ControlsState.INVENTORY);
Globals.controlHandler.showMouse();
//play sound effect
AudioUtils.playAudio("/Audio/openMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/openMenu.ogg", VirtualAudioSourceType.UI, false);
//
Globals.openInventoriesCount++;
}
@ -926,7 +927,7 @@ public class ControlHandler {
Globals.controlHandler.setHandlerState(ControlsState.INVENTORY);
Globals.controlHandler.showMouse();
//play sound effect
AudioUtils.playAudio("/Audio/openMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/openMenu.ogg", VirtualAudioSourceType.UI, false);
//
Globals.openInventoriesCount++;
}
@ -1022,7 +1023,7 @@ public class ControlHandler {
Globals.controlHandler.setHandlerState(ControlsState.IN_GAME_MAIN_MENU);
Globals.controlHandler.showMouse();
//play sound effect
AudioUtils.playAudio("/Audio/openMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/openMenu.ogg", VirtualAudioSourceType.UI, false);
}});
controls.get(DEBUG_OPEN_DEBUG_MENU).setRepeatTimeout(0.5f * Main.targetFrameRate);
}
@ -1509,26 +1510,12 @@ public class ControlHandler {
return rVal;
}
public void setShouldRecapture(boolean shouldRecapture){
public void setRecapture(boolean shouldRecapture){
this.shouldRecaptureScreen = shouldRecapture;
}
public void recaptureIfNecessary(){
if(shouldRecaptureScreen){
//Makes the window that was just created the current OS-level window context
glfwMakeContextCurrent(Globals.window);
//Maximize it
glfwMaximizeWindow(Globals.window);
//grab focus
GLFW.glfwFocusWindow(Globals.window);
//apply mouse controls state
if(Globals.controlHandler.isMouseVisible()){
Globals.controlHandler.showMouse();
} else {
Globals.controlHandler.hideMouse();
}
shouldRecaptureScreen = false;
}
public boolean shouldRecapture(){
return this.shouldRecaptureScreen;
}


@ -9,6 +9,7 @@ import org.joml.Vector3d;
import org.joml.Vector3f;
import electrosphere.audio.AudioEngine;
import electrosphere.audio.VirtualAudioSourceManager;
import electrosphere.auth.AuthenticationManager;
import electrosphere.client.culling.ClientEntityCullingManager;
import electrosphere.client.fluid.manager.ClientFluidManager;
@ -90,6 +91,7 @@ public class Globals {
//Audio Engine
//
public static AudioEngine audioEngine;
public static VirtualAudioSourceManager virtualAudioSourceManager;
//
@ -420,14 +422,12 @@ public class Globals {
public static void initDefaultAudioResources(){
LoggerInterface.loggerStartup.INFO("Loading default audio resources");
Globals.assetManager.addAudioPathToQueue("/Audio/MenuBackspace.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/MenuBadOption.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/MenuChangeOption.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/MenuType.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/inventoryGrabItem.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/inventorySlotItem.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/openMenu.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/closeMenu.ogg");
Globals.assetManager.addAudioPathToQueue("/Audio/ambienceWind1SeamlessMono.ogg");
Globals.assetManager.loadAssetsInQueue();
}
public static void initDefaultGraphicalResources(){


@ -4,11 +4,13 @@ import static org.lwjgl.glfw.GLFW.glfwGetTime;
import static org.lwjgl.glfw.GLFW.glfwTerminate;
import static org.lwjgl.glfw.GLFW.glfwWindowShouldClose;
import java.lang.management.ManagementFactory;
import java.util.concurrent.TimeUnit;
import org.ode4j.ode.OdeHelper;
import electrosphere.audio.AudioEngine;
import electrosphere.audio.VirtualAudioSourceManager;
import electrosphere.client.sim.ClientFunctions;
import electrosphere.controls.ControlHandler;
import electrosphere.engine.cli.CLIParser;
@ -17,10 +19,7 @@ import electrosphere.game.config.UserSettings;
import electrosphere.game.server.world.MacroData;
import electrosphere.logger.LoggerInterface;
import electrosphere.renderer.RenderingEngine;
import electrosphere.renderer.ui.imgui.ImGuiLinePlot;
import electrosphere.renderer.ui.imgui.ImGuiWindow;
import electrosphere.renderer.ui.imgui.ImGuiWindowMacros;
import electrosphere.renderer.ui.imgui.ImGuiLinePlot.ImGuiLinePlotDataset;
import electrosphere.server.simulation.MacroSimulation;
@ -72,6 +71,9 @@ public class Main {
//initialize logging interfaces
LoggerInterface.initLoggers();
//gets pid of engine
System.out.println(ManagementFactory.getRuntimeMXBean().getName());
//load user settings
UserSettings.loadUserSettings();
@ -142,8 +144,14 @@ public class Main {
if(Globals.RUN_CLIENT && !Globals.HEADLESS){
Globals.renderingEngine = new RenderingEngine();
Globals.renderingEngine.createOpenglContext();
Globals.initDefaultGraphicalResources();
ImGuiWindowMacros.initImGuiWindows();
}
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
System.out.println("Shutdown hook!");
}));
//uncomment to test loading a model into engine
// if(1==1){
// Globals.assetManager.addModelPathToQueue("/Models/tank1.fbx");
@ -168,20 +176,12 @@ public class Main {
//create the audio context
if(Globals.RUN_CLIENT && !Globals.HEADLESS){
Globals.virtualAudioSourceManager = new VirtualAudioSourceManager();
Globals.audioEngine = new AudioEngine();
Globals.audioEngine.init();
// Globals.audioEngine.setGain(0.1f);
}
//init default resources
if(Globals.RUN_CLIENT && !Globals.HEADLESS){
Globals.initDefaultGraphicalResources();
Globals.audioEngine.listAllDevices();
Globals.initDefaultAudioResources();
}
//init imgui debug windows
if(Globals.RUN_CLIENT && !Globals.HEADLESS){
ImGuiWindowMacros.initImGuiWindows();
// Globals.audioEngine.setGain(0.1f);
}
//fire off a loading thread for the title menus/screen
@ -199,7 +199,7 @@ public class Main {
//recapture the screen for rendering
if(Globals.RUN_CLIENT && !Globals.HEADLESS){
LoggerInterface.loggerStartup.INFO("Recapture screen");
Globals.controlHandler.setShouldRecapture(true);
Globals.controlHandler.setRecapture(true);
}
// RenderUtils.recaptureScreen();
}
@ -222,6 +222,8 @@ public class Main {
//main loop
while (running) {
LoggerInterface.loggerEngine.DEBUG("Begin Main Loop Frame");
//sets whether to capture framerates of current frame
captureFramerate = frameCount % 10 == 0;
@ -252,6 +254,7 @@ public class Main {
/// A S S E T M A N A G E R S T U F F
///
if(Globals.RUN_CLIENT){
LoggerInterface.loggerEngine.DEBUG("Begin load assets");
Globals.assetManager.loadAssetsInQueue();
}
@ -263,6 +266,7 @@ public class Main {
///
//Why is this its own function? Just to get the networking code out of main()
if(Globals.clientConnection != null){
LoggerInterface.loggerEngine.DEBUG("Begin parse client messages");
Globals.clientConnection.parseMessages();
}
@ -278,8 +282,9 @@ public class Main {
///
//Poll controls
if(Globals.RUN_CLIENT){
LoggerInterface.loggerEngine.DEBUG("Begin poll controls");
Globals.controlHandler.pollControls();
Globals.controlHandler.recaptureIfNecessary();
RenderingEngine.recaptureIfNecessary();
}
@ -287,6 +292,7 @@ public class Main {
///
/// C L I E N T S I M U L A T I O N S T U F F
///
LoggerInterface.loggerEngine.DEBUG("Begin client simulation");
if(!Globals.HEADLESS && captureFramerate){
functionTrackTimeStart = glfwGetTime();
}
@ -319,6 +325,7 @@ public class Main {
///
/// S E R V E R M I C R O S I M U L A T I O N
///
LoggerInterface.loggerEngine.DEBUG("Begin server micro simulation");
if(!Globals.HEADLESS && captureFramerate){
functionTrackTimeStart = glfwGetTime();
}
@ -329,6 +336,7 @@ public class Main {
///
/// M A C R O S I M U L A T I O N S T U F F
///
LoggerInterface.loggerEngine.DEBUG("Begin server macro simulation");
if(Globals.macroSimulation != null && Globals.macroSimulation.isReady() && framestep > 0){
Globals.macroSimulation.simulate();
}
@ -342,6 +350,7 @@ public class Main {
///
/// M A I N R E N D E R F U N C T I O N
///
LoggerInterface.loggerEngine.DEBUG("Begin rendering call");
if(!Globals.HEADLESS && captureFramerate){
functionTrackTimeStart = glfwGetTime();
}
@ -394,8 +403,12 @@ public class Main {
running = false;
}
LoggerInterface.loggerEngine.DEBUG("End Main Loop Frame");
}
LoggerInterface.loggerEngine.ERROR("ENGINE SHUTDOWN", new Exception());
//
// S H U T D O W N
//


@ -7,6 +7,9 @@ import org.joml.Quaterniond;
import org.joml.Vector3d;
import org.joml.Vector3f;
import electrosphere.audio.AudioUtils;
import electrosphere.audio.VirtualAudioSource;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.client.culling.ClientEntityCullingManager;
import electrosphere.client.foliagemanager.ClientFoliageManager;
import electrosphere.client.sim.ClientSimulation;
@ -69,7 +72,7 @@ public class ClientLoading {
//make character creation window visible
WindowUtils.recursiveSetVisible(Globals.elementManager.getWindow(WindowStrings.WINDOW_MENU_MAIN), true);
//recapture window
Globals.controlHandler.setShouldRecapture(true);
Globals.controlHandler.setRecapture(true);
//log
LoggerInterface.loggerEngine.INFO("[Client]Finished loading character creation menu");
//set menu controls again
@ -103,7 +106,7 @@ public class ClientLoading {
//make loading window disappear
loadingWindow.setVisible(false);
//recapture screen
Globals.controlHandler.setShouldRecapture(true);
Globals.controlHandler.setRecapture(true);
//set rendering flags to main game mode
Globals.RENDER_FLAG_RENDER_SHADOW_MAP = true;
Globals.RENDER_FLAG_RENDER_SCREEN_FRAMEBUFFER_CONTENT = true;
@ -241,26 +244,6 @@ public class ClientLoading {
}
});
Random rand = new Random(0);
// {
// Entity tree = ProceduralTree.clientGenerateProceduralTree("oak", rand.nextLong());
// EntityUtils.getPosition(tree).set(5,0,5);
// }
// for(int i = 0; i < 6; i++){
// Entity tree = ProceduralTree.clientGenerateProceduralTree("oak", rand.nextLong());
// // EntityUtils.getPosition(tree).set(5,0,5);
// // EntityUtils.getScale(tree).set(0.5f);
// // EntityUtils.getRotation(tree).rotateLocalX(0.5);
// EntityUtils.getPosition(tree).set(5,0,i * 5);
// }
// for(int x = 0; x < 5; x++){
// for(int z = 0; z < 5; z++){
// Entity tree = ProceduralTree.clientGenerateProceduralTree("oak", rand.nextLong());
// ClientEntityUtils.initiallyPositionEntity(tree, new Vector3d(5 + x * 5,0,5 + z * 5));
// EntityUtils.getScale(tree).set(0.5f);
// }
// }
}


@ -201,6 +201,11 @@ public class EntityDataStrings {
public static final String ATTACK_MOVE_TYPE_MELEE_SWING_ONE_HAND = "MELEE_WEAPON_SWING_ONE_HAND";
public static final String ATTACK_MOVE_TYPE_BOW_TWO_HAND = "RANGED_WEAPON_BOW_TWO_HAND";
/**
* Ambient audio
*/
public static final String CLIENT_AMBIENT_AUDIO_TREE = "clientAmbientAudioTree";
/*
* Shooter tree
*/


@ -0,0 +1,75 @@
package electrosphere.entity.state.ambientaudio;
import org.joml.Vector3d;
import electrosphere.audio.VirtualAudioSource;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.engine.Globals;
import electrosphere.entity.Entity;
import electrosphere.entity.EntityDataStrings;
import electrosphere.entity.EntityUtils;
import electrosphere.entity.state.BehaviorTree;
import electrosphere.game.data.foliage.type.AmbientAudio;
/**
 * A behavior tree encapsulating an ambient audio emitter attached to an entity
 */
public class ClientAmbientAudioTree implements BehaviorTree {
//the parent entity to this ambient audio tree
Entity parent;
//the virtual audio source that is emitting audio
VirtualAudioSource virtualAudioSource;
//the offset of the sound relative to the parent entity origin point
Vector3d offset = new Vector3d(0,0,0);
private ClientAmbientAudioTree(Entity parent){
this.parent = parent;
}
@Override
public void simulate(float deltaTime) {
//TODO: eventually swap to pushing down entity position from move methods when they move
Vector3d position = EntityUtils.getPosition(parent);
virtualAudioSource.setPosition(new Vector3d(position).add(offset));
}
/**
* Attaches this tree to the entity.
* @param parent The entity to attach to
* @param ambientAudio The ambient audio model
*/
public static ClientAmbientAudioTree attachTree(Entity parent, AmbientAudio ambientAudio){
ClientAmbientAudioTree rVal = new ClientAmbientAudioTree(parent);
if(ambientAudio.getResponseWindAudioFilePath()!=null){
Globals.assetManager.addAudioPathToQueue(ambientAudio.getResponseWindAudioFilePath());
rVal.virtualAudioSource = Globals.virtualAudioSourceManager.createVirtualAudioSource(
ambientAudio.getResponseWindAudioFilePath(),
VirtualAudioSourceType.ENVIRONMENT_LONG,
ambientAudio.getResponseWindLoops(),
new Vector3d(0,0,0)
);
if(ambientAudio.getRandomizeOffset()){
rVal.virtualAudioSource.setTotalTimePlayed((float)(Math.random() * 100));
}
if(ambientAudio.getEmitterSpatialOffset()!=null){
rVal.offset.set(
ambientAudio.getEmitterSpatialOffset()[0],
ambientAudio.getEmitterSpatialOffset()[1],
ambientAudio.getEmitterSpatialOffset()[2]
);
}
rVal.virtualAudioSource.setGain(ambientAudio.getGainMultiplier());
}
//!!WARNING!! THIS WAS MANUALLY MODIFIED OH GOD
parent.putData(EntityDataStrings.CLIENT_AMBIENT_AUDIO_TREE, rVal);
Globals.clientScene.registerBehaviorTree(rVal);
return rVal;
}
}
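Each `simulate` tick above re-derives the emitter position as the parent entity's position plus the configured spatial offset. A standalone sketch of that placement, using plain arrays as a hypothetical stand-in for JOML's `Vector3d`:

```java
/**
 * Hypothetical sketch of the per-frame emitter placement in ClientAmbientAudioTree,
 * using plain double[] arrays in place of JOML's Vector3d.
 */
public class EmitterOffsetSketch {

    /** Mirrors: virtualAudioSource.setPosition(new Vector3d(position).add(offset)) */
    static double[] emitterPosition(double[] parentPos, double[] offset) {
        return new double[]{
            parentPos[0] + offset[0],
            parentPos[1] + offset[1],
            parentPos[2] + offset[2]
        };
    }

    public static void main(String[] args) {
        //a tree at (5,0,5) with "emitterSpatialOffset": [0,3,0], as in the sample foliage data
        double[] pos = emitterPosition(new double[]{5, 0, 5}, new double[]{0, 3, 0});
        System.out.println(pos[0] + "," + pos[1] + "," + pos[2]); //5.0,3.0,5.0
    }
}
```

Copying into a fresh vector before adding (rather than mutating the parent's position in place) matters here: the parent position is shared state owned by the entity, not the audio tree.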


@ -1,5 +1,7 @@
package electrosphere.entity.types.foliage;
import electrosphere.audio.VirtualAudioSource;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.collision.PhysicsUtils;
import electrosphere.collision.collidable.Collidable;
import electrosphere.engine.Globals;
@ -8,9 +10,11 @@ import electrosphere.entity.EntityCreationUtils;
import electrosphere.entity.EntityDataStrings;
import electrosphere.entity.EntityTags;
import electrosphere.entity.EntityUtils;
import electrosphere.entity.state.ambientaudio.ClientAmbientAudioTree;
import electrosphere.entity.state.idle.IdleTree;
import electrosphere.entity.types.collision.CollisionObjUtils;
import electrosphere.entity.types.tree.ProceduralTree;
import electrosphere.game.data.foliage.type.AmbientAudio;
import electrosphere.game.data.foliage.type.FoliageType;
import electrosphere.game.data.foliage.type.PhysicsObject;
import electrosphere.net.parser.net.message.EntityMessage;
@ -95,6 +99,12 @@ public class FoliageUtils {
break;
}
}
//audio
if(rawType.getAmbientAudio()!=null){
AmbientAudio ambientAudio = rawType.getAmbientAudio();
ClientAmbientAudioTree.attachTree(rVal, ambientAudio);
}
//
ServerEntityTagUtils.attachTagToEntity(rVal, EntityTags.FOLIAGE);
rVal.putData(EntityDataStrings.FOLIAGE_IS_FOLIAGE, true);
rVal.putData(EntityDataStrings.FOLIAGE_TYPE, rawType);


@ -16,6 +16,8 @@ import org.joml.Vector4d;
import org.joml.Vector4f;
import org.ode4j.ode.DBody;
import electrosphere.audio.VirtualAudioSource;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.collision.CollisionBodyCreation;
import electrosphere.collision.PhysicsUtils;
import electrosphere.collision.collidable.Collidable;
@ -30,6 +32,7 @@ import electrosphere.entity.state.BehaviorTree;
import electrosphere.entity.types.attach.AttachUtils;
import electrosphere.entity.types.instance.InstanceTemplate;
import electrosphere.entity.types.instance.InstancedEntityUtils;
import electrosphere.game.data.foliage.type.AmbientAudio;
import electrosphere.game.data.foliage.type.FoliageType;
import electrosphere.game.data.foliage.type.TreeModel;
import electrosphere.renderer.actor.instance.InstancedActor;


@ -0,0 +1,61 @@
package electrosphere.game.data.foliage.type;
/**
* Parameters for ambient audio emitted by this foliage type
*/
public class AmbientAudio {
//the path to the audio file to play in response to wind
String responseWindAudioFilePath;
//if true, the wind response will be set to loop
boolean responseWindLoops;
//if true, when it starts playing it will randomize where in the file it starts playing
//this is useful for instance for trees starting to play audio where they aren't all playing the exact same file at the exact same point in time
boolean randomizeOffset;
//multiplies the gain by an amount
float gainMultiplier;
//the spatial offset from the origin of the attached entity to place the audio emitter
float emitterSpatialOffset[];
/**
* Gets the path to the audio file to play in response to wind
* @return The file path
*/
public String getResponseWindAudioFilePath(){
return responseWindAudioFilePath;
}
/**
* Gets whether the wind response should loop
* @return true if the wind response loops
*/
public boolean getResponseWindLoops(){
return responseWindLoops;
}
/**
* Gets whether playback should start at a randomized point in the file.
* Useful so that, for instance, many trees aren't all playing the exact same file at the exact same point in time.
* @return true if the playback offset should be randomized
*/
public boolean getRandomizeOffset(){
return randomizeOffset;
}
/**
* Gets the multiplier applied to the source's gain
* @return The gain multiplier
*/
public float getGainMultiplier(){
return gainMultiplier;
}
/**
* The offset to place the emitter relative to the parent
* @return The offset
*/
public float[] getEmitterSpatialOffset(){
return emitterSpatialOffset;
}
}


@ -24,6 +24,8 @@ public class FoliageType {
List<String> tokens;
//The model for a tree
TreeModel treeModel;
//The ambient audio model
AmbientAudio ambientAudio;
/**
* Gets the name of the foliage type
@ -73,4 +75,12 @@ public class FoliageType {
return treeModel;
}
/**
* Gets the ambient audio model
* @return The ambient audio model
*/
public AmbientAudio getAmbientAudio(){
return ambientAudio;
}
}


@ -18,6 +18,7 @@ public class LoggerInterface {
public static Logger loggerStartup;
public static Logger loggerAuth;
public static Logger loggerDB;
public static Logger loggerAudio;
public static void initLoggers(){
loggerStartup = new Logger(LogLevel.WARNING);
@ -28,6 +29,7 @@ public class LoggerInterface {
loggerEngine = new Logger(LogLevel.WARNING);
loggerAuth = new Logger(LogLevel.WARNING);
loggerDB = new Logger(LogLevel.WARNING);
loggerAudio = new Logger(LogLevel.DEBUG);
loggerStartup.INFO("Initialized loggers");
}
}


@ -3,6 +3,7 @@ package electrosphere.menu;
import java.util.List;
import electrosphere.audio.AudioUtils;
import electrosphere.audio.VirtualAudioSourceManager.VirtualAudioSourceType;
import electrosphere.controls.ControlHandler.ControlsState;
import electrosphere.engine.Globals;
import electrosphere.entity.Entity;
@ -49,7 +50,7 @@ public class MenuGeneratorsInventory {
Globals.controlHandler.hideMouse();
}
//play sound effect
AudioUtils.playAudio("/Audio/closeMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/closeMenu.ogg", VirtualAudioSourceType.UI, false);
return false;
}});
@ -136,7 +137,7 @@ public class MenuGeneratorsInventory {
div.removeChild(panel);
WindowUtils.pushItemIconToItemWindow(panel);
//play sound effect
AudioUtils.playAudio("/Audio/inventoryGrabItem.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/inventoryGrabItem.ogg", VirtualAudioSourceType.UI, false);
return false;
}});
panel.setOnDrag(new DragEventCallback() {public boolean execute(DragEvent event){
@ -166,7 +167,7 @@ public class MenuGeneratorsInventory {
//re-render inventory
WindowUtils.replaceWindow(WindowUtils.getInventoryWindowID(inventory.getId()), MenuGeneratorsInventory.createNaturalInventoryMenu(inventory));
//play sound effect
AudioUtils.playAudio("Audio/inventorySlotItem.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/inventorySlotItem.ogg", VirtualAudioSourceType.UI, false);
}
//now the fun begins :)
//if transfer item
@ -214,7 +215,7 @@ public class MenuGeneratorsInventory {
Globals.controlHandler.hideMouse();
}
//play sound effect
AudioUtils.playAudio("/Audio/closeMenu.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/closeMenu.ogg", VirtualAudioSourceType.UI, false);
return false;
}});
@ -295,7 +296,7 @@ public class MenuGeneratorsInventory {
div.removeChild(panel);
WindowUtils.pushItemIconToItemWindow(panel);
//play sound effect
AudioUtils.playAudio("Audio/inventoryGrabItem.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/inventoryGrabItem.ogg", VirtualAudioSourceType.UI, false);
return false;
}});
panel.setOnDrag(new DragEventCallback() {public boolean execute(DragEvent event){
@ -317,7 +318,7 @@ public class MenuGeneratorsInventory {
EquipState equipState = EquipState.getEquipState(Globals.playerEntity);
equipState.commandAttemptEquip(item,inventory.getEquipPointFromSlot(slots.get(itemId)));
//play sound effect
AudioUtils.playAudio("Audio/inventorySlotItem.ogg");
Globals.virtualAudioSourceManager.createVirtualAudioSource("/Audio/inventorySlotItem.ogg", VirtualAudioSourceType.UI, false);
}
//update ui
Globals.dragSourceInventory = null;


@ -91,13 +91,11 @@ public class ClientNetworking implements Runnable{
// inputStream = new CryptoInputStream("AES/ECB/PKCS5Padding",properties,socket.getInputStream(),key,spec);
// } catch (IOException ex) {
// ex.printStackTrace();
// System.exit(1);
// }
// try {
// outputStream = new CryptoOutputStream("AES/ECB/PKCS5Padding",properties,socket.getOutputStream(),key,spec);
// } catch (IOException ex) {
// ex.printStackTrace();
// System.exit(1);
// }
//Used to copy messages from network parser to NetMonitor
@ -122,7 +120,6 @@ public class ClientNetworking implements Runnable{
}
if(connectionAttempts > MAX_CONNECTION_ATTEMPTS){
LoggerInterface.loggerNetworking.ERROR("Max client connection attempts!", new Exception());
// System.exit(1);
}
}


@ -1,5 +1,6 @@
package electrosphere.net.parser.net.raw;
import electrosphere.logger.LoggerInterface;
import electrosphere.net.parser.net.message.NetworkMessage;
import java.io.IOException;
import java.io.InputStream;
@ -51,7 +52,7 @@ public class NetworkParser {
}
} catch (IOException ex) {
ex.printStackTrace();
System.exit(0);
LoggerInterface.loggerNetworking.ERROR("", ex);
}
}


@ -60,7 +60,7 @@ public class Server implements Runnable{
} catch (IOException ex) {
LoggerInterface.loggerNetworking.ERROR("Failed to start server socket!",ex);
ex.printStackTrace();
System.exit(1);
LoggerInterface.loggerNetworking.ERROR("", ex);
}
while(Main.isRunning()){
Socket newSocket;


@ -136,7 +136,7 @@ public class ServerConnectionHandler implements Runnable {
serverProtocol = new ServerProtocol(this);
} catch (IOException ex) {
ex.printStackTrace();
System.exit(1);
LoggerInterface.loggerNetworking.ERROR("", ex);
}
}


@ -155,8 +155,7 @@ public class Mesh {
// }
// normalData.rewind();
if(numVertices != numNormals){
System.out.println("Catastrophic failure: Number of vertices =/= Number of normals");
System.exit(1);
LoggerInterface.loggerRenderer.ERROR("Catastrophic failure: Number of vertices =/= Number of normals", new Exception("Catastrophic failure: Number of vertices =/= Number of normals"));
}
@ -708,7 +707,6 @@ public class Mesh {
// System.out.println("Found torso bone");
// System.out.println(currentUniform);
// System.out.println(currentMat);
// System.exit(0);
// }
GL45.glUniformMatrix4fv(glGetUniformLocation(Globals.renderingEngine.getActiveShader().shaderProgram, currentUniform), false, bufferarray);
} else {


@ -298,7 +298,6 @@ public class Model {
// // if(s.contains("Walk")){
// // currentAnimation.describeAnimation();
// //// currentAnimation.fullDescribeAnimation();
// // System.exit(0);
// // }
// currentAnimation.timeCurrent = 0;
// Iterator<AnimChannel> channelIterator = currentAnimation.channels.iterator();


@ -1390,7 +1390,6 @@ public class RenderUtils {
// System.out.println("AAAAAAAAAAAAAAAAAA");
// System.out.println(finalQuads.size());
// System.exit(0);
int incrementer = 0;


@ -563,6 +563,7 @@ public class RenderingEngine {
// checkError();
//check and call events and swap the buffers
LoggerInterface.loggerRenderer.DEBUG("Swap buffers");
glfwSwapBuffers(Globals.window);
glfwPollEvents();
}
@ -1621,6 +1622,27 @@ public class RenderingEngine {
return renderPipelineState;
}
/**
* Tries to recapture the screen
*/
public static void recaptureIfNecessary(){
if(Globals.controlHandler.shouldRecapture()){
//Makes the window that was just created the current OS-level window context
glfwMakeContextCurrent(Globals.window);
//Maximize it
glfwMaximizeWindow(Globals.window);
//grab focus
GLFW.glfwFocusWindow(Globals.window);
//apply mouse controls state
if(Globals.controlHandler.isMouseVisible()){
Globals.controlHandler.showMouse();
} else {
Globals.controlHandler.hideMouse();
}
Globals.controlHandler.setRecapture(false);
}
}
/**
* Checks for any errors currently caught by OpenGL.
* Refer: https://docs.gl/gl4/glGetError


@ -3,8 +3,9 @@ package electrosphere.renderer.ui.imgui;
import java.util.HashMap;
import java.util.Map;
import electrosphere.audio.VirtualAudioSource;
import electrosphere.engine.Globals;
import electrosphere.renderer.RenderingEngine;
import electrosphere.renderer.ui.imgui.ImGuiBarPlot.ImGuiBarPlotDatapoint;
import electrosphere.renderer.ui.imgui.ImGuiLinePlot.ImGuiLinePlotDataset;
import electrosphere.renderer.ui.imgui.ImGuiWindow.ImGuiWindowCallback;
import imgui.ImGui;
@ -29,6 +30,11 @@ public class ImGuiWindowMacros {
private static ImGuiBarPlot serverFrametimePlot;
private static double serverFrametimeTrackerStorage = 0;
//audio debug menu
private static ImGuiWindow audioDebugMenu;
private static boolean showAllVirtualAudioChildren = false;
private static boolean showMappedVirtualAudioChildren = true;
/**
* Initializes imgui windows
*/
@ -36,6 +42,7 @@ public class ImGuiWindowMacros {
createMainDebugMenu();
createFramerateGraph();
createServerFrametimeGraph();
createAudioDebugMenu();
}
/**
@ -124,6 +131,68 @@ public class ImGuiWindowMacros {
serverFrametimePlot.clearDatapoints();
}
/**
* Create audio debug menu
*/
private static void createAudioDebugMenu(){
audioDebugMenu = new ImGuiWindow("Audio");
audioDebugMenu.callback = new ImGuiWindowCallback() {
@Override
public void exec() {
//audio engine details
ImGui.text("Audio Engine Details");
ImGui.text("Current audio device: " + Globals.audioEngine.getDevice());
ImGui.text("Default audio device: " + Globals.audioEngine.getDefaultDevice());
ImGui.text("Has HRTF: " + Globals.audioEngine.getHRTFStatus());
ImGui.text("Listener location: " + Globals.audioEngine.getListener().getPosition());
ImGui.text("Listener eye vector: " + Globals.audioEngine.getListener().getEyeVector());
ImGui.text("Listener up vector: " + Globals.audioEngine.getListener().getUpVector());
ImGui.text("Virtual Audio Source Manager Details");
ImGui.text("Total number of active virtual sources: " + Globals.virtualAudioSourceManager.getSourceQueue().size());
//only active children
if(showMappedVirtualAudioChildren){
ImGui.beginChild("mapped virtual sources");
for(VirtualAudioSource source : Globals.virtualAudioSourceManager.getMappedSources()){
ImGui.text("Source (priority: " + source.getPriority() + ")");
ImGui.text(" - Position " + source.getPosition());
ImGui.text(" - Gain " + source.getGain());
ImGui.text(" - Type " + source.getType());
ImGui.text(" - Total time played " + source.getTotalTimePlayed());
ImGui.text(" - Buffer Length " + source.getBufferLength());
}
ImGui.endChild();
if(ImGui.button("Hide Mapped Virtual Children")){
showMappedVirtualAudioChildren = false;
}
} else {
if(ImGui.button("Show Mapped Virtual Children")){
showMappedVirtualAudioChildren = true;
}
}
//all virtual children
if(showAllVirtualAudioChildren){
ImGui.beginChild("all virtual sources");
for(VirtualAudioSource source : Globals.virtualAudioSourceManager.getSourceQueue()){
ImGui.text("Position " + source.getPosition());
}
ImGui.endChild();
if(ImGui.button("Hide All Virtual Children")){
showAllVirtualAudioChildren = false;
}
} else {
if(ImGui.button("Show All Virtual Children")){
showAllVirtualAudioChildren = true;
}
}
//close button
if(ImGui.button("Close")){
RenderingEngine.removeImGuiWindow(audioDebugMenu);
}
}
};
}
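The debug menu above distinguishes "mapped" sources from the full virtual source queue: when there are more virtual sources than real hardware channels, only the highest-priority ones get bound to a real source. The sketch below shows one plausible version of that priority cut; the class and method names are illustrative stand-ins, not the engine's actual `VirtualAudioSourceManager` API.

```java
import java.util.ArrayList;
import java.util.List;

public class VirtualSourceMappingSketch {
    static class VirtualSource {
        final int priority;
        VirtualSource(int priority) { this.priority = priority; }
        int getPriority() { return priority; }
    }

    /** Picks the top realChannelCount sources by descending priority. */
    static List<VirtualSource> mapToRealSources(List<VirtualSource> queue, int realChannelCount) {
        List<VirtualSource> sorted = new ArrayList<>(queue);
        sorted.sort((a, b) -> b.getPriority() - a.getPriority()); // highest priority first
        return sorted.subList(0, Math.min(realChannelCount, sorted.size()));
    }

    public static void main(String[] args) {
        List<VirtualSource> queue = new ArrayList<>();
        queue.add(new VirtualSource(1));
        queue.add(new VirtualSource(9));
        queue.add(new VirtualSource(5));
        // only two real channels available: the priority-1 source is culled
        List<VirtualSource> mapped = mapToRealSources(queue, 2);
        System.out.println(mapped.size());
        System.out.println(mapped.get(0).getPriority());
    }
}
```

Under this scheme the "Show All Virtual Children" view corresponds to the full queue and the "Mapped" view to the sorted prefix.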
/**
* Inits the main debug menu
@ -141,6 +210,10 @@ public class ImGuiWindowMacros {
if(ImGui.button("Show Server Frametime Breakdown")){
RenderingEngine.addImGuiWindow(serverFrametimeWindow);
}
//show audio debug
if(ImGui.button("Show Audio Debug Menu")){
RenderingEngine.addImGuiWindow(audioDebugMenu);
}
//close button
if(ImGui.button("Close")){
RenderingEngine.removeImGuiWindow(mainDebugWindow);

View File

@ -279,6 +279,7 @@ public class GriddedDataCellManager implements DataCellManager, VoxelCellManager
//isn't null
groundDataCells.get(getServerDataCellKey(worldPos)) != null
){
LoggerInterface.loggerEngine.DEBUG("Get server data cell key: " + getServerDataCellKey(worldPos));
rVal = groundDataCells.get(getServerDataCellKey(worldPos));
}
return rVal;
@ -392,6 +393,7 @@ public class GriddedDataCellManager implements DataCellManager, VoxelCellManager
private ServerDataCell createServerDataCell(Vector3i worldPos){
ServerDataCell rVal = parent.createNewCell();
groundDataCells.put(getServerDataCellKey(worldPos),rVal);
LoggerInterface.loggerEngine.DEBUG("Create server data cell with key " + getServerDataCellKey(worldPos));
cellPositionMap.put(rVal,new Vector3i(worldPos));
serverContentManager.generateContentForDataCell(parent, worldPos, rVal);
return rVal;
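The `GriddedDataCellManager` changes above log the key used for both lookup and creation, which only works if both paths go through the same key function. A minimal sketch of that keyed create-if-absent pattern, with a hypothetical bit-packed key scheme (the real `getServerDataCellKey` implementation is not shown in this diff):

```java
import java.util.HashMap;
import java.util.Map;

public class DataCellSketch {
    static class ServerDataCell { }

    static Map<Long, ServerDataCell> groundDataCells = new HashMap<>();

    /** Packs a 3D world position into a single map key (assumed scheme). */
    static long getServerDataCellKey(int x, int y, int z) {
        return ((long) x & 0xFFFFF) | (((long) y & 0xFFFFF) << 20) | (((long) z & 0xFFFFF) << 40);
    }

    /** Lookup and creation share one key function, so keys always agree. */
    static ServerDataCell getOrCreateCell(int x, int y, int z) {
        return groundDataCells.computeIfAbsent(getServerDataCellKey(x, y, z), k -> new ServerDataCell());
    }

    public static void main(String[] args) {
        ServerDataCell a = getOrCreateCell(1, 2, 3);
        ServerDataCell b = getOrCreateCell(1, 2, 3); // same key: same cell
        System.out.println(a == b);
        System.out.println(groundDataCells.size());
    }
}
```

Logging the key on both the create and lookup paths, as the diff does, makes key mismatches between the two paths visible in the engine log.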

View File

@ -1,6 +1,7 @@
package electrosphere.server.datacell.utils;
import electrosphere.entity.Entity;
import electrosphere.logger.LoggerInterface;
import electrosphere.server.datacell.ServerDataCell;
/**

View File

@ -6,6 +6,7 @@ import com.google.gson.GsonBuilder;
import electrosphere.engine.Main;
import electrosphere.game.data.creature.type.movement.MovementSystem;
import electrosphere.game.data.creature.type.movement.MovementSystemSerializer;
import electrosphere.logger.LoggerInterface;
import electrosphere.util.annotation.AnnotationExclusionStrategy;
import java.io.BufferedReader;
@ -122,6 +123,11 @@ public class FileUtils {
// return rVal;
// }
/**
* Sanitizes a relative file path, guaranteeing that the initial slash is correct
* @param filePath The raw file path
* @return The sanitized file path
*/
public static String sanitizeFilePath(String filePath){
String rVal = filePath;
rVal = rVal.trim();
@ -146,8 +152,7 @@ public class FileUtils {
try {
Files.write(path, gson.toJson(object).getBytes());
} catch (IOException ex) {
ex.printStackTrace();
System.exit(1);
LoggerInterface.loggerFileIO.ERROR(filePath, ex);
}
}
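The `sanitizeFilePath` body is truncated by the hunk boundary above; its javadoc only promises that the initial slash is made correct. One plausible reading of that contract, trimming whitespace and normalizing to exactly one leading slash, is sketched below; the normalization rule is an assumption, not the committed implementation.

```java
public class SanitizeSketch {
    /** Trims the path and guarantees exactly one leading slash (assumed contract). */
    static String sanitizeFilePath(String filePath) {
        String rVal = filePath.trim();
        // strip any existing leading slashes, then add exactly one back
        while (rVal.startsWith("/")) {
            rVal = rVal.substring(1);
        }
        return "/" + rVal;
    }

    public static void main(String[] args) {
        System.out.println(sanitizeFilePath("Audio/wind.ogg"));
        System.out.println(sanitizeFilePath("  //Audio/wind.ogg"));
    }
}
```

A normalization like this lets callers pass asset paths with or without a leading slash and still hit the same file.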

View File

@ -2,6 +2,7 @@ package electrosphere.util;
import electrosphere.engine.Globals;
import electrosphere.engine.Main;
import electrosphere.logger.LoggerInterface;
import electrosphere.renderer.Material;
import electrosphere.renderer.Mesh;
import electrosphere.renderer.Model;
@ -165,7 +166,7 @@ public class Utilities {
Files.write(file.toPath(), gson.toJson(object).getBytes());
} catch (IOException ex) {
ex.printStackTrace();
System.exit(1);
LoggerInterface.loggerFileIO.ERROR(fileName, ex);
}
}