
rainybyte

19 Posts · 5 Topics · 1 Following

A member registered 126 days ago

Recent community posts

Confirmed.

It looks like the demo project doesn't come with the profiler enabled. I'm also having some trouble changing the code to enable it so I can check whether the problem exists with the demo project too.


https://drive.google.com/open?id=0BzWL7E-FTHJhUkZIcDRKbHVWU1E

https://drive.google.com/open?id=0BzWL7E-FTHJhbGtmWDYtczlodEE

https://drive.google.com/open?id=0BzWL7E-FTHJhZUZrTTNVNFpLeUk

https://drive.google.com/open?id=0BzWL7E-FTHJhenVGdWVrN2xaTEk

Here are four screenshots. In order they should be:

In game - Dedicated

In game - Integrated

Title - Dedicated

Title - Integrated

I have some better call traces, including OpenGL calls. Sorry I couldn't hunt this down earlier; the end of the semester got kind of rough. The call traces are currently locked inside an Apple ".trace" file, so I need to analyze them a bit more.

I can make myself available to address the performance issues anytime within the next few days. It looks like the integrated graphics performance is now great, but the game still has trouble with the dedicated GPU. It seems to be something about the CPU/GPU handoff, because the GPU shows 4% utilization and renders frames in a few dozen microseconds (if it could keep that pace up, it would be thousands of fps). This may also be related to the 100% CPU utilization issues in other places. Is there a demo BDX project somewhere? I have IDEA and development software installed, so I could see whether it's something about your game specifically, or a problem with BDX/libGDX/LWJGL. Other games run quite well on this system, so it isn't one of those cases of Apple breaking things.

Intriguingly enough: after editing some of the shaders so I could load in, I'm able to get the game to run with improved performance as long as I stick to the integrated graphics, whereas the dedicated GPU still shows the same performance issues. SolarLune, I am going to try to profile this a little more later. Is there a hidden profiler I can enable in your game?

Which CPU does your system have?


Most of that info is in other posts in this thread; I believe you will find your questions answered there. The rest is in my other thread. The problem is not the same.

It does not. Not even close to identical. The GPUs are a Radeon Pro 450 (2048 MB) and Intel HD Graphics 530 (1536 MB).


No problem. I would bet that more shaders have the same issues, though, so try finding a pedantic shader compiler and fixing every place that mixes unlike types. C and all its derivatives (GLSL included) are very strictly typed here.

UPDATE: I can't seem to find the diffs right now, but I had to edit the bgblur, blur, and crepuscular ray 2d shaders.

It works on other systems, like mine, for example.

I still had to repair the shaders to get them to compile, and when I get into the game it still shows the same issues. The game music runs smoothly, but the textbox, movement, and just about everything else progress only once or twice per second.

As expected, it doesn't behave terribly well, but I can see that 92.7% of the CPU time is eaten by org.lwjgl.opengl.GL11.nglDrawElementsBO. It doesn't give me a call trace; the actual profiler is never able to attach to the game.

The game did not behave well (profiler never ran, or game crashed) with the Apple Profilers. I can try VisualVM but I'm not holding my breath.

Using the shader workaround mentioned in "Game fails to pass Title Screen", I am able to begin playing the game. Unfortunately the game runs poorly, but it does not appear to be a normal performance issue. The intro text crawls onto my screen one character at a time, as I would expect (since I played the Windows version), but the time between characters appearing is about a second per character. It could be a strange performance issue, but maybe you used a platform-specific timing or sleep function somewhere? Anything I can do to help provide more info for you? The game seems to use 100% of one of my cores but doesn't do much with it.
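If it is a timing bug, one frame-rate-independent way to pace a text crawl is to accumulate delta time each frame instead of sleeping per character. A minimal sketch of the idea (hypothetical names, not Gearend's actual code):

```java
// Sketch: frame-rate-independent text crawl driven by accumulated delta time.
// All names here are hypothetical, for illustration only.
public class TextCrawl {
    private final String text;
    private final double secondsPerChar;
    private double elapsed = 0.0;

    public TextCrawl(String text, double secondsPerChar) {
        this.text = text;
        this.secondsPerChar = secondsPerChar;
    }

    // Call once per frame with that frame's delta time in seconds;
    // returns the portion of the text that should currently be visible.
    public String update(double deltaSeconds) {
        elapsed += deltaSeconds;
        int shown = Math.min(text.length(), (int) (elapsed / secondsPerChar));
        return text.substring(0, shown);
    }

    public static void main(String[] args) {
        TextCrawl crawl = new TextCrawl("Hello", 0.05);
        String visible = "";
        // Simulate ten ~16 ms frames (~0.16 s total): 0.16 / 0.05 -> 3 chars.
        for (int i = 0; i < 10; i++) visible = crawl.update(0.016);
        System.out.println(visible); // prints "Hel"
    }
}
```

Because the character count is derived from elapsed time rather than frame count, the crawl runs at the same speed whether the game renders at 2 fps or 200 fps.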

Running around the Hub zone for a while was quite entertaining.

Repeating this process lets me load into the game, which then has very poor performance. Something about the shaders isn't liking my setup.

It looks like it doesn't like you dividing a float by an int. If you edit the shaders to properly handle types, it will work. I extracted the first offending shader and used a float constructor to ensure it would work, and it now does. Interestingly enough, the game seems to attempt to compile ALL shaders, whether the corresponding setting is on or off. I recommend compiling only the shaders that the current settings will actually use, as a workaround until you get all that type stuff fixed.
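For reference, the kind of edit involved looks something like this (an illustrative GLSL fragment matching the reported errors, not Gearend's actual shader code):

```glsl
// Strict GLSL compilers reject arithmetic that mixes int and float operands.
uniform int samples;

vec4 average(vec4 colorSum) {
    // return colorSum / samples;      // ERROR: '/' does not operate on 'vec4' and 'int'
    return colorSum / float(samples);  // OK: the float() constructor converts the int
}
```

The same pattern applies to the `float / int` and `int * float` errors in the logs: make both operands float, either with a `float()` constructor or by writing literals as `2.0` instead of `2`.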

Sadly, the JVM doesn't like my OpenGL profiler/debugger, so I am having trouble providing more detailed bug reports. Any suggestions?

Confirmed fixed for me.

UPDATE: The game works for me on my Win10 x64 PC

Trying out the new build on my Mac:


With all default settings:

Loading JVM runtime library ...

Passing VM options ...

# -Xmx2G

Creating Java VM ...

Passing command line arguments ...

Loading JAR file ...

Invoking static com.solarlune.gearend.desktop.DesktopLauncher.main() function ...

Controllers: added manager for application, 1 managers active

AL lib: (WW) FreeDevice: (0x7f9b61117000) Deleting 4 Buffer(s)

Exception in thread "LWJGL Application" java.lang.RuntimeException: Shader compilation error in ScreenShader at:

ERROR: 0:33: '/' does not operate on 'float' and 'int'

ERROR: 0:34: '/' does not operate on 'float' and 'int'

ERROR: 0:35: '/' does not operate on 'float' and 'int'

ERROR: 0:36: '/' does not operate on 'float' and 'int'

< Vertex Shader >

bdx/shaders/2d/default.vert

< Fragment Shader >

bdx/shaders/2d/bgblur.frag

at com.nilunder.bdx.gl.ScreenShader.check(ScreenShader.java:37)

at com.nilunder.bdx.gl.ScreenShader.<init>(ScreenShader.java:30)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:45)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:49)

at com.solarlune.gearend.system.BGCamera.init(BGCamera.java:24)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:109)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:118)

at com.solarlune.gearend.system.SysCon.init(SysCon.java:216)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:102)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:113)

at com.solarlune.gearend.system.mainmenu.MainMenuCursor.main(MainMenuCursor.java:213)

at com.nilunder.bdx.Scene.runObjectLogic(Scene.java:851)

at com.nilunder.bdx.Scene.update(Scene.java:907)

at com.nilunder.bdx.Bdx.main(Bdx.java:289)

at com.solarlune.gearend.BdxApp.render(BdxApp.java:35)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:223)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:124)

Destroyed Java VM ...


Things I tried:

  • Lowered crepuscular ray quality to low
  • Turned crepuscular rays off
  • Turned bloom off
  • Turned bg blur off
  • Left scanlines off
  • Turned vignette off
  • Turned advanced lighting off
  • Turned off screen shake
None of these changes resolved the problem. All of them were made in windowed mode, so I am trying the fullscreen setting next.

Still fails under fullscreen, and it is more painful to kill, since the fullscreen method used prevents switching away from the window when it locks up on macOS.


Conclusion: still failing.

Undertale, for example, doesn't quit immediately; it shows "quitting" in the corner while Escape is held, so accidental Escape key presses don't do anything.
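The suggested behavior amounts to requiring Escape to be held continuously before quitting. A minimal sketch of that guard (hypothetical names, not how Undertale or Gearend actually implement it):

```java
// Sketch of an "Escape must be held to quit" guard.
// Names are hypothetical, for illustration only.
public class QuitGuard {
    private final double requiredSeconds;
    private double heldSeconds = 0.0;

    public QuitGuard(double requiredSeconds) {
        this.requiredSeconds = requiredSeconds;
    }

    // Call once per frame; returns true only after Escape has been held
    // continuously for the required time. Releasing the key resets the timer,
    // so a stray tap never quits the game.
    public boolean update(boolean escapeDown, double deltaSeconds) {
        heldSeconds = escapeDown ? heldSeconds + deltaSeconds : 0.0;
        return heldSeconds >= requiredSeconds;
    }

    public static void main(String[] args) {
        QuitGuard guard = new QuitGuard(1.0);
        System.out.println("after tap: " + guard.update(true, 0.016));
        boolean quit = false;
        for (int i = 0; i < 70 && !quit; i++) { // hold for ~1.1 s of frames
            quit = guard.update(true, 0.016);
        }
        System.out.println("after hold: " + quit);
    }
}
```

While `heldSeconds` is nonzero but below the threshold, the game can draw the "quitting" indicator in the corner, matching the behavior described above.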

The door you use a keycard on will not let me traverse back through it when I continue from a saved game.

The game fails to pass the title screen on macOS Sierra 10.12.3.

This computer is a MacBook Pro (15-inch, Late 2016) with a 2.6 GHz Intel Core i7 processor and 16 GB of 2133 MHz LPDDR3 RAM.

The game was running while the GPU was set to dynamic selection. My two GPUs are:

Radeon Pro 450 2048 MB

Intel HD Graphics 530 1536 MB


If I demand that the computer stay on the integrated graphics, then stdout is:

Using bundle resource folder [1]: /Users/rainybyte/Downloads/gearend.app/Contents/Resources/[Gearend]

Loading JVM runtime library ...

Passing VM options ...

# -Xmx2G

Creating Java VM ...

Passing command line arguments ...

Loading JAR file ...

Invoking static com.solarlune.gearend.desktop.DesktopLauncher.main() function ...

Controllers: added manager for application, 1 managers active

AL lib: (WW) FreeDevice: (0x7f99d484ae00) Deleting 4 Buffer(s)

Exception in thread "LWJGL Application" java.lang.RuntimeException: Shader compilation error in ScreenShader at:

ERROR: 0:24: '*' does not operate on 'int' and 'float'

ERROR: 0:25: '*' does not operate on 'int' and 'float'

ERROR: 0:33: '/' does not operate on 'vec4' and 'int'

< Vertex Shader >

bdx/shaders/2d/default.vert

< Fragment Shader >

bdx/shaders/2d/bgblur.frag

at com.nilunder.bdx.gl.ScreenShader.check(ScreenShader.java:37)

at com.nilunder.bdx.gl.ScreenShader.<init>(ScreenShader.java:30)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:45)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:49)

at com.solarlune.gearend.system.BGCamera.init(BGCamera.java:24)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:109)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:118)

at com.solarlune.gearend.system.SysCon.init(SysCon.java:216)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:102)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:113)

at com.solarlune.gearend.system.mainmenu.MainMenuCursor.main(MainMenuCursor.java:213)

at com.nilunder.bdx.Scene.runObjectLogic(Scene.java:851)

at com.nilunder.bdx.Scene.update(Scene.java:907)

at com.nilunder.bdx.Bdx.main(Bdx.java:289)

at com.solarlune.gearend.BdxApp.render(BdxApp.java:35)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:223)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:124)

Destroyed Java VM ...

If I demand that the computer use the AMD dedicated GPU, the problem remains.


Confirmed a hang when clicking the "rebind input" button on macOS Sierra 10.12.3.