I'm using Rewired for the first time and I'm not quite sure that I "get it" all the time. I'm currently working on a game prototype which will be available on desktop and on mobile, so the game will be playable with a keyboard or a touch screen, for instance. In that game there is an inventory with a fixed number of slots (no scrolling). The player should be able to open/close the inventory, select an inventory item (moving left/right), and consume that item.

It's easy to imagine how keyboard keys can be bound to Actions in order to use that inventory, but I'm having more difficulty seeing how it works with a touch screen. With an external input device (keyboard, d-pad, etc.), those elements are first selected using a navigation system and then an action is performed on them. With a touch screen (mobile), you directly "touch" those elements; you do not use an "input device" to select them and then perform an action on them.

Should the whole inventory be a Custom Controller, with every interactable item (open/close button, slots) being an element of that Custom Controller? Or should I handle touches on the inventory as usual and then override a Custom Controller element value (like a "touch controller") in order to feed that element and trigger an Action? Not sure I'm clear enough here, but I guess my main question is how to manage "touchable" elements in my game (buttons, sprites, UIs, whatever) in a "Rewired style."

Rewired's Action-based structure does not lend itself to being a general touch-interface input system. Rewired was not designed to be that, and trying to make it into one is going to be messy. Touching items and interacting with them doesn't really fit that paradigm: you most certainly don't want to have a different Action for every item you could possibly interact with. The Action-based system works well for on-screen touch controls, but it does not work for uses such as you are describing. You have more data at your disposal when dealing with touches than you do with Actions, such as finger id, raycast targets, movement vectors, etc.

Unity's UI system has a "Selectable" navigation system for keyboard and joystick input, and it uses data directly available from touches/clicks for touch-based input and mouse input. If you're using Unity UI for your interface, it should basically be automatic: the RewiredStandaloneInputModule uses 4 Actions to navigate, and your UI elements respond to the Submit Action, which can come from any controller, as well as to a touch/click. This isn't the "Rewired" way of doing things - it's the Unity UI way of doing things, and it works for what you're describing. If you're using some other UI system, then you would have to make that system get its input from Rewired in order to use it.

As for the error, look at the base of the call stack:

PresentationLayer.AR.UIToolkit.ObjectClickFunction:Update() (at Assets/App/Scripts/PresentationLayer/AR/UIToolkit/ObjectClickFunction.cs:19)

This function is calling player.GetButtonDown when Rewired has been deinitialized and is no longer running. What is deinitializing Rewired? It is deinitialized when OnDestroy or OnDisable is called by Unity. Anything that results in OnDestroy or OnDisable being called on the GameObject will cause Rewired to shut down: the GameObject being destroyed or disabled, the InputManager component being disabled, loading a new scene without Don't Destroy On Load enabled, recompiling scripts, etc. Ultimately, the error is coming from the code in PresentationLayer.AR.UIToolkit.ObjectClickFunction:Update().
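A minimal sketch of guarding against that error: Rewired exposes `ReInput.isReady`, which is false once the system has been deinitialized, so polling code in `Update()` can bail out instead of calling into a shut-down system. The class name and Action name below are placeholders, not the poster's actual `ObjectClickFunction` code, and the real fix is usually to keep the Rewired Input Manager alive (e.g. with Don't Destroy On Load) rather than to mask the symptom.

```csharp
using UnityEngine;
using Rewired;

// Hypothetical stand-in for a script that polls Rewired every frame.
public class ObjectClickExample : MonoBehaviour {
    private Player player;

    void Awake() {
        // Player 0 is assumed; use whatever player id your setup defines.
        player = ReInput.players.GetPlayer(0);
    }

    void Update() {
        // Rewired shuts down when its InputManager receives OnDisable or
        // OnDestroy (scene unload, recompile, etc.). Skip input polling
        // while the system is not running.
        if (!ReInput.isReady) return;

        if (player.GetButtonDown("Select")) { // "Select" is an assumed Action name
            // ... handle the click/selection ...
        }
    }
}
```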
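For completeness, the "feed a Custom Controller from touches" option raised in the question could be sketched roughly as below. This assumes a Custom Controller (id 0 here) has been defined in the Rewired Input Manager, assigned to Player 0, and that its button element 0 is mapped to a "Consume" Action; all of those names and ids are illustrative, and as noted above this is generally not the recommended approach for touchable UI elements.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using Rewired;

// Attached to an inventory slot: while the slot is touched, the Custom
// Controller's button element 0 reads as pressed, so
// player.GetButtonDown("Consume") fires just as it would for a keyboard
// key bound to the same Action.
public class TouchToCustomController : MonoBehaviour, IPointerDownHandler, IPointerUpHandler {
    private CustomController controller;

    void Awake() {
        Player player = ReInput.players.GetPlayer(0);
        controller = player.controllers.GetController<CustomController>(0);
    }

    public void OnPointerDown(PointerEventData eventData) {
        controller.SetButtonValue(0, true);  // element 0 -> pressed
    }

    public void OnPointerUp(PointerEventData eventData) {
        controller.SetButtonValue(0, false); // element 0 -> released
    }
}
```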