
Gearend Demo

Explore an Abandoned Robotics Research Facility with an interchangeable Robot · By SolarLune

[SOLVED] Game fails to pass Title Screen

A topic by rainybyte created Apr 15, 2017 Views: 567 Replies: 11

The game fails to get past the title screen on macOS Sierra 10.12.3.

This computer is a MacBook Pro (15-inch, Late 2016) with a 2.6 GHz Intel Core i7 processor and 16 GB of 2133 MHz LPDDR3 RAM.

The game was run with the GPU set to automatic graphics switching. My two GPUs are:

Radeon Pro 450 2048 MB

Intel HD Graphics 530 1536 MB


If I force the game onto the integrated graphics, stdout is:

Using bundle resource folder [1]: /Users/rainybyte/Downloads/gearend.app/Contents/Resources/[Gearend]

Loading JVM runtime library ...

Passing VM options ...

# -Xmx2G

Creating Java VM ...

Passing command line arguments ...

Loading JAR file ...

Invoking static com.solarlune.gearend.desktop.DesktopLauncher.main() function ...

Controllers: added manager for application, 1 managers active

AL lib: (WW) FreeDevice: (0x7f99d484ae00) Deleting 4 Buffer(s)

Exception in thread "LWJGL Application" java.lang.RuntimeException: Shader compilation error in ScreenShader at:

ERROR: 0:24: '*' does not operate on 'int' and 'float'

ERROR: 0:25: '*' does not operate on 'int' and 'float'

ERROR: 0:33: '/' does not operate on 'vec4' and 'int'

< Vertex Shader >

bdx/shaders/2d/default.vert

< Fragment Shader >

bdx/shaders/2d/bgblur.frag

at com.nilunder.bdx.gl.ScreenShader.check(ScreenShader.java:37)

at com.nilunder.bdx.gl.ScreenShader.<init>(ScreenShader.java:30)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:45)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:49)

at com.solarlune.gearend.system.BGCamera.init(BGCamera.java:24)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:109)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:118)

at com.solarlune.gearend.system.SysCon.init(SysCon.java:216)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:102)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:113)

at com.solarlune.gearend.system.mainmenu.MainMenuCursor.main(MainMenuCursor.java:213)

at com.nilunder.bdx.Scene.runObjectLogic(Scene.java:851)

at com.nilunder.bdx.Scene.update(Scene.java:907)

at com.nilunder.bdx.Bdx.main(Bdx.java:289)

at com.solarlune.gearend.BdxApp.render(BdxApp.java:35)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:223)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:124)

Destroyed Java VM ...

If I force the dedicated AMD GPU instead, the problem remains.


Hey, thanks for the feedback. I've updated with a new build. If anything like this comes up again, try toggling the options off one at a time and note everything you see, so I can catch all the issues in one report. Thanks!

UPDATE: The game works for me on my Win10 x64 PC

Trying out the new build on my Mac:


With all default settings:

Loading JVM runtime library ...

Passing VM options ...

# -Xmx2G

Creating Java VM ...

Passing command line arguments ...

Loading JAR file ...

Invoking static com.solarlune.gearend.desktop.DesktopLauncher.main() function ...

Controllers: added manager for application, 1 managers active

AL lib: (WW) FreeDevice: (0x7f9b61117000) Deleting 4 Buffer(s)

Exception in thread "LWJGL Application" java.lang.RuntimeException: Shader compilation error in ScreenShader at:

ERROR: 0:33: '/' does not operate on 'float' and 'int'

ERROR: 0:34: '/' does not operate on 'float' and 'int'

ERROR: 0:35: '/' does not operate on 'float' and 'int'

ERROR: 0:36: '/' does not operate on 'float' and 'int'

< Vertex Shader >

bdx/shaders/2d/default.vert

< Fragment Shader >

bdx/shaders/2d/bgblur.frag

at com.nilunder.bdx.gl.ScreenShader.check(ScreenShader.java:37)

at com.nilunder.bdx.gl.ScreenShader.<init>(ScreenShader.java:30)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:45)

at com.nilunder.bdx.gl.ScreenShader.load(ScreenShader.java:49)

at com.solarlune.gearend.system.BGCamera.init(BGCamera.java:24)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:109)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:118)

at com.solarlune.gearend.system.SysCon.init(SysCon.java:216)

at com.nilunder.bdx.Scene.initGameObject(Scene.java:564)

at com.nilunder.bdx.Scene.init(Scene.java:397)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:102)

at com.nilunder.bdx.Bdx$ArrayListScenes.add(Bdx.java:113)

at com.solarlune.gearend.system.mainmenu.MainMenuCursor.main(MainMenuCursor.java:213)

at com.nilunder.bdx.Scene.runObjectLogic(Scene.java:851)

at com.nilunder.bdx.Scene.update(Scene.java:907)

at com.nilunder.bdx.Bdx.main(Bdx.java:289)

at com.solarlune.gearend.BdxApp.render(BdxApp.java:35)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:223)

at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:124)

Destroyed Java VM ...


Things I tried:

  • Lowered crepuscular ray quality to low
  • Turned crepuscular rays off
  • Turned bloom off
  • Turned BG blur off
  • Left scanlines off
  • Turned vignette off
  • Turned advanced lighting off
  • Turned off screen shake
None of these changes resolved the problem. All of them were made in windowed mode, so I'm trying the fullscreen setting next.

Still fails in fullscreen, and it's more painful to kill, since the fullscreen method used prevents switching away from the window when it locks up on macOS.


Conclusion: still failing.

Sadly the JVM doesn't play well with my OpenGL profiler/debugger, so I'm having trouble providing more detailed bug reports. Any suggestions?


Hey Solar! Just letting you know that I'm having the same problems as rainybyte. I'm also on Mac OSX (El Capitan version 10.11.6).

It looks like it doesn't like dividing a float by an int. If you edit the shaders to handle the types properly, it works. I extracted the first offending shader and used a float constructor to make the types match, and it now compiles. Interestingly enough, the game seems to attempt to compile ALL the shaders, whether their settings are on or off. As a workaround until the type issues are all fixed, I'd recommend only compiling the shaders the current settings actually use.
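To make the float-constructor fix concrete, here is the kind of change involved. This is only a sketch: the real bgblur.frag source isn't shown in this thread, so the uniform names below are made up, but the pattern (promoting int-valued operands with float literals or a float() constructor) is the one being described:

```glsl
// Strict compilers (like Apple's) reject mixed int/float arithmetic:
//     vec2 offset = vec2(1, 1) / resolution * 2;   // '* 2' is vec2 * int: error
// Promoting with float literals (or float()) compiles everywhere:
uniform sampler2D tex;
uniform vec2 resolution;   // hypothetical uniform names
varying vec2 uv;

void main() {
    vec2 offset = vec2(1.0, 1.0) / resolution * 2.0;
    gl_FragColor = texture2D(tex, uv + offset);
}
```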

Repeating this process for each failing shader lets me load into the game, which then has very poor performance. Something about the shaders doesn't like my setup.

moved this topic to Bug Reports / Suggestions

That makes sense. Even though Java is cross-platform, the shaders are compiled (or interpreted) by each system's graphics driver, so they're handled slightly differently on each system.

i.e. float-by-integer division might be only a warning under a Windows compiler but a hard error under a Mac compiler.
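For what it's worth, GLSL's versioning rules line up with this. A shader with no `#version` directive is treated as GLSL 1.10, which defines no implicit int-to-float conversion; 1.20 and later do, and some Windows drivers are lax enough to accept the mixing either way, while Apple's compiler follows the spec strictly. A minimal illustration (a hypothetical shader, not the actual Gearend source):

```glsl
#version 120
// Without the #version line above, this file would compile as GLSL 1.10,
// where 'c / 2' fails with: '/' does not operate on 'vec4' and 'int'.
// Declaring 1.20 adds the implicit conversion; a float literal works under both.
uniform sampler2D tex;
varying vec2 uv;

void main() {
    vec4 c = texture2D(tex, uv);
    gl_FragColor = c / 2.0;   // float literal: safe on every compiler
}
```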

Yo, thanks for the information so far, guys. I'm not certain exactly what the issue is with OSX, but I know I've observed it before on another Mac, so it's at least consistent. I'll have to look into it a bit more - maybe work on the internal profiler so I can get some info from testers. Thanks for being patient!


No problem. I'd bet that more shaders have the same issues, though, so try finding a pedantic shader compiler and fixing everywhere that mixes unlike types. C and all its derivatives (GLSL included) are very strongly typed.

UPDATE: I can't seem to find the diffs right now, but I had to edit the bgblur, blur, and crepuscular ray 2D shaders.


So I incorporated a fix similar to rainybyte's, and I can now enter the game on Mac OSX. However, it's pretty laggy and harsh on the GPU, and music doesn't play in-game (the title music plays, but otherwise only sound effects work). I can't see the framerate, but I'd put it under 20 fps. For reference, I can play games like Hearthstone, Shovel Knight, Braid and Undertale on this Mac at full fps.

Since the err.log doesn't show me much, aside from there being a type error (trying to multiply an int by a float), I can't really help much until I look at this through a profiler (which I might try later).

Anyway, to get it to run, getting rid of the type conflicts should help?

Edit: So I got around to using a profiler on it. The CPU usage seems to be in the right places (updating object logic and rendering). Instrumenting the game kills my fps, so I didn't play much; it took me like 2 minutes to get through the first cutscene lol. For me, the profiler wasn't much help :/


So this update should resolve this issue. The poor performance on OSX will probably remain, but that can be continued in the appropriate thread. The game crashing because of shader compilation should hopefully be fixed.