
hackathon-sep20's Introduction

Project proposals for the September 2020 Aardvark Hackathon

If you'd like to propose a project for the hackathon, just open a new issue.

If you'd like to see what other projects people have proposed, take a look at the current proposals.

hackathon-sep20's Issues

Spatial Microphone Toy

What would this gadget do?

This gadget would serve as the VR equivalent of a tape deck recorder toy, rendering captured sounds from underlying applications down to stereo audio snippets that represent the sound as perceived from the location of the microphone (user's hand) at the time of recording.


Reach goals could address the snippet playback environment and the play that emerges from these 'sound pearls'. The ability to place these pearls around a space and record the resulting soundscape with the same microphone could be a fun form of emergent play (precedent: the spatial sound recording in AnimVR).

Who would use this gadget?

Sound designers/engineers, musicians, streamers

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I'd need collaboration around everything that isn't sound design/ graphic design (I'm a programming novice with zero React experience). I don't even know enough to know what I don't know, in fact. I'd need enthusiastic and patient programming partners in order to attempt this project.

What will be the toughest part of building this gadget?

Probably rendering spatially specific stereo (or stereo+) audio from the hand location rather than the head position, which may require toggling between those two audio vantage points or tracking both simultaneously. The playback experience and the UI for managing these 'sound pearls' could also become involved quickly.
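One way to prototype the hand-positioned capture, if the underlying application's audio were available as a Web Audio MediaStream (a big assumption; Aardvark may not expose this), would be to park the audio listener at the hand pose and record the spatialized mix. A minimal sketch, with `savePearl` as a hypothetical gallery hook:

```typescript
declare function savePearl(blob: Blob): void; // hypothetical "sound pearl" gallery call

const ctx = new AudioContext();

function recordPearl(appAudio: MediaStream, handPos: () => [number, number, number]) {
  const source = ctx.createMediaStreamSource(appAudio);
  const panner = new PannerNode(ctx, { panningModel: 'HRTF' });
  // (each sound source would set panner.positionX/Y/Z to its world position)
  const tape = ctx.createMediaStreamDestination(); // stereo capture target
  source.connect(panner).connect(tape);

  // Each frame, park the listener at the hand pose instead of the head pose.
  const follow = () => {
    const [x, y, z] = handPos();
    ctx.listener.positionX.value = x;
    ctx.listener.positionY.value = y;
    ctx.listener.positionZ.value = z;
    requestAnimationFrame(follow);
  };
  follow();

  const recorder = new MediaRecorder(tape.stream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => savePearl(new Blob(chunks));
  recorder.start();
  return recorder; // call .stop() when the user releases the record button
}
```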

Transition input between virtual layers using the thumbpad (Frank)

What would this gadget do?

This application disables input to the VR layer, the AR layer, or neither, depending on your needs.

Using AR layers in VR risks the two layers interfering with each other, because inputs in virtual worlds are designed on the assumption that they are the user's only focus.

The Index Controller's thumbpad is rarely used in games, and even when it is, not all of its inputs are utilized. So the thumbpad could be used to switch between VR input, AR input, or both. The application would disable all control of AR layers at launch. The user would slide up on the thumbpad to re-enable control of the AR layer, then again to disable input to the VR layer. This would be cued visually by hands rendering at half opacity when both layers are controlled, and full opacity when controlling only AR. A haptic response, similar to the one in Half-Life: Alyx when Alyx receives her grabbity gloves, would make it clear that the input target has changed so the user notices. These hands would also solve the issue of controller/hand occlusion when running Aardvark as passthrough AR.

This also mitigates occlusion issues by putting something on the gadget layer that can render over gadgets, since real hands and in-game hands won't.

Who would use this gadget?

I feel like this could be of universal use and could simplify the design of many gadgets, for example by making a dedicated way of "locking" a UI element unnecessary, as well as avoiding accidental input in gadgets. If it could perform the same function in SteamVR, blocking input to discrete applications, it could also be a good way of regulating input when using other dual-layer programs like Metachromium.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The most difficult function to design might be disabling input to the VR layer. Dashboard overlays that render over one another block input, like the hand menu in fpsVR over the dashboard overlay, but it is unclear whether Aardvark could do the same over normal SteamVR software.

Twitch integration: Channel point rewards to trigger arbitrary events for the VR streamer (affiliate)

What would this gadget do?

Allow a Twitch viewer to use the built in channel point rewards to cause arbitrary events to happen for the VR streamer.

Who would use this gadget?

Twitch streamers that want to give their viewership a means of interacting with the stream in a tangible way that is also well integrated into the Twitch platform.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

As I'm used to working on my side projects alone, I don't really expect anyone else to jump in on this. I'm also in Europe, which might affect who could help out if we need to communicate. I will probably handle the Twitch integration myself, as I'm a backend person at my day job. I have used TypeScript but never looked at React before this, so React knowledge would be nice, I guess.

What will be the toughest part of building this gadget?

It starts with the integration against the Twitch PubSub API, which is an active WebSockets connection, something I have experience working with. API access needs an OAuth access token with specific scopes, so OAuth authentication has to happen before anything else can work. I'm hopeful we can use the OAuth2 Device Flow for this, which is available for Twitch on consoles today, but I'm not sure it is open to everyone. The authentication is something the project hinges on, and might be something I have to explore beforehand, if there's time.
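For reference, a minimal sketch of the PubSub side: one WebSocket, a LISTEN frame for the channel-points topic, and periodic PINGs. `channelId`, `oauthToken` (with the `channel:read:redemptions` scope), and `triggerGadgetEvent` are assumed to come from elsewhere in the gadget:

```typescript
declare const channelId: string;   // from the Twitch API after login
declare const oauthToken: string;  // needs the channel:read:redemptions scope
declare function triggerGadgetEvent(rewardTitle: string): void; // hypothetical hook

const ws = new WebSocket('wss://pubsub-edge.twitch.tv');

ws.onopen = () => {
  ws.send(JSON.stringify({
    type: 'LISTEN',
    nonce: Math.random().toString(36).slice(2),
    data: { topics: [`channel-points-channel-v1.${channelId}`], auth_token: oauthToken },
  }));
  // PubSub drops connections that don't PING at least every five minutes.
  setInterval(() => ws.send(JSON.stringify({ type: 'PING' })), 4 * 60 * 1000);
};

ws.onmessage = (ev) => {
  const msg = JSON.parse(ev.data);
  if (msg.type !== 'MESSAGE') return;
  const payload = JSON.parse(msg.data.message); // topic payload is a JSON string
  if (payload.type === 'reward-redeemed') {
    triggerGadgetEvent(payload.data.redemption.reward.title);
  }
};
```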

Other than that, React is still an unknown to me, but from what I've seen on the Slack server it should be fairly straightforward. I imagine setting up a few example events that can be triggered, so hopefully not too complicated. It could be nice to add a configuration gadget to help with setup, but I imagine most of that would be done on the desktop.

Side notes

Something that pushed me to submit this was hitting Affiliate on Twitch. Not a huge deal, but it unlocks a few platform features, of which Channel Points is one. As it is, I have infinite channel points for my own channel, so I can most likely test this out without hassle. It is possible to define a range of custom rewards for channel points, which is why it's a suitable feature to build on.

An example project that examples very well

What would this gadget do?

It would serve as the best possible example to all of the other gadgets.

Who would use this gadget?

People who like to submit project ideas to hackathons

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Designers and 3D modelers

What will be the toughest part of building this gadget?

Coming up with a good example to example about.

Playing Cards

What would this gadget do?


  • A deck of cards from which shuffled cards can be pulled
  • All cards can be pulled back into the deck
  • The deck can be shuffled (see the shuffle sketch below)
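Shuffling is the one well-defined algorithmic piece; a standard Fisher–Yates shuffle gives a uniform random ordering:

```typescript
// Fisher-Yates shuffle: uniform over all orderings, in place.
function shuffle<T>(deck: T[]): T[] {
  for (let i = deck.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [deck[i], deck[j]] = [deck[j], deck[i]];
  }
  return deck;
}

// Build and shuffle a 52-card deck.
const suits = ['♠', '♥', '♦', '♣'];
const ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K'];
const deck = shuffle(suits.flatMap((s) => ranks.map((r) => r + s)));
```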

Who would use this gadget?

Anyone who likes to play card games. Could be good during idle time or as a full card game experience with friends.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Interaction design and coding

What will be the toughest part of building this gadget?

Making grabbing and maneuvering cards feel good will be challenging, since you don't have the fine motor control you'd normally have when playing with physical cards.

Macropad VR

What would this gadget do?

This gadget creates a 3D UI object that serves as a macropad, allowing a user to perform various actions to streamline their workflow. Integration with programs like Photoshop, OBS, and Twitch, plus standard keyboard shortcuts, lets the gadget supplement or replace dedicated macro hardware and makes it accessible to more people. The code would make it easy to tweak the layout, add buttons and knobs, and change their function (press, hold, twist, etc.). In VR, novel functions not possible in real life would make it an attractive use case: tooltips when hovering over a button, pulling buttons out to use frequently in a session, and so on.

Who would use this gadget?

Macro decks are expensive and only partly customizable. In VR, the configuration, shape, scope, and size of macro decks can be far more flexible and cost little to nothing. While a streamer could use this while standing, the ideal use case would be breaking new ground by making seated work in VR much more flexible in Excel, creative software, and more. This could pair well with hand tracking or passthrough AR, and with any desktop mirror like Desktop+. It would make work in PCVR much more capable and less constricting than any competitor, and novel functionality could make it an actually viable option.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

To use real macros, it would need to interface with existing APIs. Making the UI so that accidental inputs aren't possible, without significantly slowing down use, is essential, as is making the layout easily customizable in both the number of buttons/knobs/switches and the shape of the tool.
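As one illustration of the "interface with existing APIs" problem, a hedged sketch: the gadget UI sends a macro to a small local Node helper, which synthesizes the keystroke with the robotjs package (the browser sandbox can't inject OS input; the hotkey bindings shown are assumptions, not defaults):

```typescript
// Runs in a local Node helper, not in the gadget's browser context.
import robot from 'robotjs';

interface Macro { label: string; key: string; modifiers: string[]; }

const macros: Macro[] = [
  { label: 'OBS: start/stop recording', key: 'f9', modifiers: [] }, // assumes OBS hotkey bound to F9
  { label: 'Photoshop: undo', key: 'z', modifiers: ['control'] },
  { label: 'Save', key: 's', modifiers: ['control'] },
];

// Fires the shortcut OS-wide; the focused application receives it.
function fire(macro: Macro) {
  robot.keyTap(macro.key, macro.modifiers);
}
```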

Future versions would need a more complicated back end to incorporate things like AutoHotkey and gain much more functionality.

TransceiVR: Bridging Asymmetrical Communication Between External and VR Users

What would this gadget do?

Virtual Reality (VR) users often need to work with other users, who observe them outside of VR using an external display. Communication between them is difficult; the VR user cannot see the external user’s gestures, and the external user cannot see VR scene elements outside of the VR user’s view.

The gadget would allow external users to explore the VR scene spatially, to annotate in 3D in the VR scene with annotations placed at correct depths, and to have shared discussions via a shared static virtual display.

I had previously developed a version of this using IVROverlay, but it could render only 2D elements in the scene. A video and a research paper on the system can be found at https://www.tkbala.com/transceivr

Who would use this gadget?

Users who carry out collaborative tasks in VR, in which one or more users are outside VR (spectators)

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I would need members who have experience with ReactJS, and possibly members who have experience developing multi-user networked applications.

What will be the toughest part of building this gadget?

Rendering 3D annotations at the correct position in the VR scene.

A framework for implementing universal Hand Tracking (Grabbity)

What would this gadget do?

This gadget would add three forms of hand tracking to Aardvark. The first is the Vive hand tracking SDK, adding support for the Vive Pro, Index, and Vive. The second is the Leap Motion API, adding support for any headset with Leap Motion hardware attached. The third is the Valve Index's finger tracking, combining the controller's position data with the finger-tracking data (as well as the built-in "pinch" gesture) to form a hand tracking emulation that does not use the Index controllers' buttons.

Ideally the gadget would feed data from any of these into a common hand tracking interface, so each gadget does not have to support them individually and new hand tracking methods can be configured to feed in their data. This also provides a model for other developers to create universal hand tracking.

Hands rendered from this tracking would let users interact with UI either by pressing with an index finger or by pinching to create a laser, and the hands would be dynamic objects that can collide with other gadgets that choose to enable it.
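A sketch of what the common interface might look like, with every name hypothetical; the point is only that each backend normalizes into one frame format that gadgets consume:

```typescript
type Vec3 = [number, number, number];

interface HandFrame {
  hand: 'left' | 'right';
  joints: Vec3[];          // fixed joint order shared by all backends
  pinchStrength: number;   // 0..1, emulated on Index from finger curl
}

interface HandTrackingProvider {
  readonly name: string;   // e.g. 'vive-sdk' | 'leap-motion' | 'index-emulated'
  isAvailable(): Promise<boolean>;
  start(onFrame: (frame: HandFrame) => void): void;
  stop(): void;
}

// Pick the first backend that reports itself available on this setup.
async function selectProvider(providers: HandTrackingProvider[]) {
  for (const p of providers) {
    if (await p.isAvailable()) return p;
  }
  return null;
}
```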


Who would use this gadget?

Developers who intend to use Aardvark as a development platform for AR may be reluctant given its lack of hand tracking; given the option, many devs would be more receptive to Aardvark. Hands would be rendered over the user's tracked hands, which would also fix the occlusion issue. Occlusion zones rendered over the tracked hands are a more advanced method that could hook into #9. This solves one of the main issues with making Aardvark a functionally feature-complete AR experience.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest part of building this gadget would be incorporating all three methods and creating a common system that funnels their hand data together.

General OAuth client (OpenID Connect)

What would this gadget do?

A way to perform OAuth authentications with Aardvark.

Who would use this gadget?

It would be neat if this could become a component that others use to integrate their own gadgets with any online service that supports it, tailored for OpenID Connect, which is widely adopted.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

React knowledge would be useful, as I lack that.

What will be the toughest part of building this gadget?

Figuring out how to do this in an accessible and intuitive way; that comes before implementation.
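One accessible option, since typing credentials in VR is painful, is the OAuth 2.0 Device Authorization Grant (RFC 8628): the user confirms a short code on their phone or desktop while the gadget polls. A sketch with placeholder endpoint paths and client ID:

```typescript
// Device flow sketch; authBase, endpoint paths, and clientId are placeholders
// that would differ per provider.
async function deviceFlow(authBase: string, clientId: string, scopes: string) {
  // Step 1: ask the provider for a device code and a user-facing code.
  const init = await fetch(`${authBase}/device/code`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({ client_id: clientId, scope: scopes }),
  }).then((r) => r.json());

  // Show init.verification_uri and init.user_code to the user here.

  // Step 2: poll the token endpoint until the user has approved.
  while (true) {
    await new Promise((res) => setTimeout(res, (init.interval ?? 5) * 1000));
    const tok = await fetch(`${authBase}/token`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({
        client_id: clientId,
        device_code: init.device_code,
        grant_type: 'urn:ietf:params:oauth:grant-type:device_code',
      }),
    }).then((r) => r.json());
    if (tok.access_token) return tok;
    if (tok.error !== 'authorization_pending') throw new Error(tok.error);
  }
}
```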

Side note: I closed my other proposals as this one kind of took over. Both of them would use this as a component, though, so there's that.

An ambitious AR/VR multiplayer game (Ricochet AR)

Almost certainly not a realistic proposal

What would this gadget do?

This gadget is a possible test for a more ambitious game and can be played in AR or VR. Players are non-local, and the game is mapped to the play space of the player with the smaller room, then played entirely at roomscale. It takes up a large area: either the full arena (3 m by 4 m) or just the portion needed by each individual player (3 m by 2 m).

Players are restricted to their side, with frisbees in their hands that they throw as ranged weapons. Frisbees bounce off walls or virtual obstacles three times, then despawn and return to the player's hands; if a frisbee hits a player, that player loses 50% of their health. Players can also place a rectangular object on the border between the two players and trace it in so it provides collision for the frisbees, showing up as a virtual object for the other player unless both place an object in the same relative spot. Once a player is knocked out the round ends, and users can play best of 3, 5, or 7.

Who would use this gadget?

This would be an aggressively ambitious showcase of how Aardvark gadgets can communicate with low latency and provide high level gameplay in an overlay gadget.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Making an entire game in Aardvark's systems appears to be one of the hardest things you could do. Beyond the game itself, it needs reliable collision, synchronization between the players, 3D models and avatars, and room information.

Screensharing

What would this gadget do?

This gadget mirrors the user’s display and makes it accessible to other users. So a user can have multiple additional copies of their entire monitor, or just an individual window, to share with other users.

Who would use this gadget?

This gadget is a lean solution to screensharing for work, content consumption, or small group streaming in any context.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Making the screen capture reliable and consistent over whatever social integration Aardvark is using (Pluto, Hubs, etc).

Multi-User Primitive 3D Object ToolBox

What would this gadget do?


It is one or more gadgets providing basic 3D primitives that can be moved and scaled. A bonus feature could be uploading custom GLBs to add to the toolbox.

Who would use this gadget?

Anyone looking to communicate ideas and sketch out 3D concepts:

  • Game Designers
  • Architects
  • Teachers
  • Business Meetings

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Programming and Interaction Design

What will be the toughest part of building this gadget?

Managing all of the objects and making it easy to save/load sets of objects

Questions:

  • Inter-Gadget communication: Could there be another app, like a color picker that could be used to change the color of the objects?
  • What is the best way to save and load the objects? Is this a single aardvark gadget or actually a bunch of them?
  • What kind of perf does Aardvark have when launching large numbers of gadgets?

Generating a flat surface

What would this gadget do?

This gadget allows a user to create a flat surface for other gadgets to use for collision and placement. It's a very light version of #9, creating just a flat surface meant to be used as a table, without occlusion zones or an explicitly listed context. Players can change the opacity of the surface.

Who would use this gadget?

This gadget would primarily be for other gadgets to take advantage of, making it simple for gadgets to assume a flat surface that they share with others. For users, it allows them to use gadgets on top of VR or AR surfaces. This greatly increases the viability of casual gadgets like card games and objects that don’t anchor to the player’s body.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Creating a way for the surface to provide collision for other gadgets seems like the hardest part.

Bookshelf Launcher UI (Vault)

What would this gadget do?

This gadget allows a user to import their Steam library in order to create a 3D interactable bookshelf for their VR software. Each game would be represented by a book on the shelf with its art on the cover and its name on the spine. Users would be able to import game box art that matches a premade template (cover, spine, and back) for each game, arrange games on the shelf, and save that layout and configuration. They would also be able to import non-Steam games and programs and add "books" for them. Users open a "book" to launch the game.

Additional functionality could add a bookmark object to each game “book” that can be pulled out to set custom settings that would be enabled when starting that game. These would include resolution modifiers, refresh rate, motion smoothing settings, and volume.

Who would use this gadget?

SteamVR lacks any spatial UI for arranging or launching games. This gadget lets users create a personalized spatial version of their library and sets a model for future ways of laying out VR content. This could work with #29 to place a bookshelf in the AR passthrough of the user's room.

Importing non-Steam VR games into SteamVR has not worked for almost a year, since the Steam UI overhaul, so this lets the community work around that and other UI issues.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest parts of building this gadget would be importing library data from Steam and creating a usable bookshelf where users can easily browse and remove books from the shelf.
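On the library-import side, the Steam Web API's GetOwnedGames endpoint and the steam:// protocol cover the two ends of the pipeline. A sketch, noting that the Web API call would likely need to run outside the browser sandbox (e.g. a small local helper) because of CORS:

```typescript
interface OwnedGame { appid: number; name: string; }

// Fetch the user's library; apiKey and steamId are supplied by the user.
async function getOwnedGames(apiKey: string, steamId: string): Promise<OwnedGame[]> {
  const url = `https://api.steampowered.com/IPlayerService/GetOwnedGames/v1/` +
    `?key=${apiKey}&steamid=${steamId}&include_appinfo=true&format=json`;
  const res = await fetch(url).then((r) => r.json());
  return res.response.games;
}

// Opening a "book" resolves to a protocol launch handled by the Steam client.
function launchGame(appid: number) {
  window.location.href = `steam://rungameid/${appid}`;
}
```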

A game to show off basic physics in Aardvark (CAKE)

What would this gadget do?

This gadget is a simple game to show off physics in Aardvark, while accounting for potential issues with netcode or stuttering that might emerge.

In CAKE, users have a set of 25 pastry pieces that they must stack on a solid base. To stack, users line up pieces over the ones underneath. Pieces clip through each other until a player lets go and physics takes effect, pulling the piece straight down, so the momentum of the player's hand is irrelevant. The game has several modes: all players playing pieces onto the same stack one by one, trying not to topple it; players building as fast as they can until they have stacked all 25 pieces or only one player's stack hasn't fallen over; players being offered two pieces, where the piece they don't select is the one the next player must use, and then it reverses; and tallest stack before any player drops two pieces.

The gadget uses either #30 or its own flat surface to create a table for users to play on.

This game is inspired by the dexterity game “Junk Art.”

Who would use this gadget?

This gadget is a simple social game meant to showcase both interoperability of users’ gadgets and physics in Aardvark.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Adding physics to the gadget and scripting the game modes would be the toughest parts of making this gadget.

Babycam & Home Security video pass-through

What would this gadget do?
Pass through video from Ring and other home camera systems and baby cameras to the VR player

Who would use this gadget?
Humans - anyone in VR who wants to see the real world at a glance, without leaving VR

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?
As an art/content guy, I have none of the skills required for this myself and would love to see more capable hands lead it. We would need people with experience connecting video applications (C#, C++, and Python all appear to be used). It would also be awesome to have anyone who has these systems on the same network as their VR equipment available to test. On the API side, there are some crowd-sourced API projects and a ton of alphabet soup.

https://github.com/dgreif/ring
https://pypi.org/project/ring-doorbell/
https://developers.nest.com/reference/api-camera
https://www.twilio.com/blog/smart-baby-monitor-python-raspberry-pi-twilio-sms-peripheral-sensors
https://www.lollipop.camera/

What will be the toughest part of building this gadget?
Making the application useful for a wide variety of video inputs - it appears that there are a variety of DIY APIs for Ring and other camera setups; then, once video is available, pushing a notification to the user on system events - motion or sound on a baby monitor, motion detection at the front door, etc., activating the window overlay in VR.
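For the Ring case specifically, a hedged sketch using the ring-client-api package from the dgreif/ring project linked above (the exact API surface should be checked against its README); `showVrNotification` is a hypothetical overlay call:

```typescript
import { RingApi } from 'ring-client-api';

declare function showVrNotification(message: string): void; // hypothetical VR overlay hook

async function watchRing(refreshToken: string) {
  const ring = new RingApi({ refreshToken });
  const cameras = await ring.getCameras();

  for (const camera of cameras) {
    // Push a notification into VR whenever a camera reports motion.
    camera.onMotionDetected.subscribe((motion) => {
      if (motion) showVrNotification(`Motion at ${camera.name}`);
    });
  }
}
```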

Passthrough Portals (Chells)

What would this gadget do?

This gadget would allow the user to create real-world camera portals and virtual-world rendering portals. These would let the user define a 3D space in their play area, such as a keyboard, a glass of water, or a door, and have that window remain persistent in space, providing a view of that object. Obvious use cases would be using a mouse and keyboard while in VR, but you could also make a window for a friend so you can see them while you play. The gadget doesn't need to actually understand your real-world environment; it just passes camera video through on the cross-section of the room-anchored 3D object. A person wearing a Vive Tracker on their head could be passed through the same way, as a passthrough zone relative to the tracker.

It could also allow the user to make a window out of passthrough back into the game, allowing existing VR software and games to act as AR software by removing nonessential elements. There would be a toggleable blue shimmering backlight on portals of AR into VR, and an orange one on VR into AR.

While researching this, I found an attempt in the form of a SteamVR overlay that uses 2D planes using the Vive camera: https://store.steampowered.com/app/864790/FragmentVR/

Who would use this gadget?

This gadget would be ideal for people working in VR, people using VR socially in the same room, users who want to have a snack on a table next to them (see the object to grab it, then use touch for the rest), or anything else. Even just for keyboard and mouse, this may be a better solution than any competitor: it works with any hardware, includes the mouse, and is useful for lots of other cases, like leaving a door open in the world to talk to someone.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I could actually make the artwork, but I am not a programmer or experienced with 3D modelling.

What will be the toughest part of building this gadget?

The toughest part would be getting 3D passthrough running on only part of the screen, and on a 3D object at that. The SteamVR overlay I linked does not work on the Index, which may be because it uses the webcam functionality of the Index camera and cannot combine two images or incorporate 3D passthrough.

Multi-User Ball

What would this gadget do?


It is a ball that can be grabbed, thrown, and caught.

Who would use this gadget?

Anyone could use a ball. It is a great idle activity: it could be used during loading screens or casually passed back and forth during a conversation.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Most of the skills and work required are in the networked physics.

What will be the toughest part of building this gadget?

Getting the networked interactions to feel good is the most challenging aspect.
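A sketch of the local half of that problem: estimate the release velocity from recent hand samples and integrate ballistic motion, replicating only the release state (position, velocity, timestamp) over the network so each client simulates the same arc:

```typescript
type Vec3 = { x: number; y: number; z: number };

const samples: { pos: Vec3; t: number }[] = [];

// Called every frame while the ball is held.
function onHandPose(pos: Vec3, t: number) {
  samples.push({ pos, t });
  if (samples.length > 8) samples.shift(); // keep ~8 frames of history
}

// Average velocity over the sample window at the moment of release.
function releaseVelocity(): Vec3 {
  const a = samples[0], b = samples[samples.length - 1];
  const dt = (b.t - a.t) / 1000;
  return {
    x: (b.pos.x - a.pos.x) / dt,
    y: (b.pos.y - a.pos.y) / dt,
    z: (b.pos.z - a.pos.z) / dt,
  };
}

// Per-frame ballistic step with a damped floor bounce.
function step(pos: Vec3, vel: Vec3, dt: number) {
  vel.y -= 9.81 * dt; // gravity
  pos.x += vel.x * dt; pos.y += vel.y * dt; pos.z += vel.z * dt;
  if (pos.y < 0) { pos.y = 0; vel.y = -vel.y * 0.8; }
}
```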

Question:

  • Does/Could Aardvark support collisions with other gadgets? If this was possible, the ball could bounce off other gadgets and other simple gadgets could be created like a "wall" that could be specifically used for creating a place for the ball to bounce off of.

Handheld Camera for Steam Screenshots

Screenshots are a great way to share an experience with steam friends after the fact, but in VR they're often at the mercy of in-game options or a simple PoV capture. The solution could be a handheld, Aardvark-powered camera.

My assumption is that Aardvark can access pixels in a given app it's overlaying, but if that's not the case, then it could just act as a mask/viewfinder that activates the standard screenshot tool along with a simple gallery for saving/sharing.

Meta-Matchmaking (Garry)

What would this gadget do?

This gadget would allow for “meta-matchmaking,” where users can note their interest in playing multiplayer VR games and be matched with other users who want to play the same game. Multiplayer VR games in the user's Steam library would all be listed, and the user selects the ones they are interested in playing. When a group of 2, 4, or 10 users have marked the same game, the users in question are notified and invited to launch the game through a launch UI within the gadget so they can play together. This will require a server or similar method to function.

The gadget would also list the number of users currently playing each game on the list, which requires the gadget to be aware of what game every user is playing and to communicate it if it is on the list. Users can hide or show their playing status if they want.

Who would use this gadget?

Users who enjoy multiplayer games would use this gadget to find parties to play with in games that lack a sustainable user base. At scale, it would help VR games that are multiplayer focused build and maintain a playerbase.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, 3D modelling, or setting up servers, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

This gadget would require setting up the necessary servers and programming the software to reliably keep track of users and bring them together.
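A minimal sketch of that server, using the ws package, with invented message shapes: clients declare interest in an app ID, and once enough of them overlap, everyone waiting is invited:

```typescript
import { WebSocketServer, WebSocket } from 'ws';

const PARTY_SIZE = 2; // could be 2, 4, or 10 per game
const interested = new Map<string, Set<WebSocket>>(); // appId -> waiting sockets

const wss = new WebSocketServer({ port: 8080 });
wss.on('connection', (ws) => {
  ws.on('message', (raw) => {
    const msg = JSON.parse(raw.toString()); // expected: { type: 'interest', appId: string }
    if (msg.type !== 'interest') return;

    const waiting = interested.get(msg.appId) ?? new Set<WebSocket>();
    waiting.add(ws);
    interested.set(msg.appId, waiting);

    if (waiting.size >= PARTY_SIZE) {
      // Invite everyone and clear the queue for this game.
      for (const peer of waiting) {
        peer.send(JSON.stringify({ type: 'match', appId: msg.appId }));
      }
      interested.delete(msg.appId);
    }
  });
  // A real service would also prune sockets on close and track presence.
});
```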

A Body Placement Framework for Gadgets (HEV)

What would this gadget do?

This gadget creates a body of 3D space relative to the user's controllers so that gadgets can be arranged on, or relative to, their forearms and other body parts. Rather than each gadget individually finding a place to put itself relative to the user, gadgets are assigned a place by this gadget after the existing ones. The user can enable a config mode that lets them remove and rearrange their gadgets when needed.

Gadgets can occupy discrete zones laid out in grids (for example, 3 zones on the x axis, 6 on the y axis, and 1 on the z axis): one grid along the underside of each forearm, one offset from the user's forearm, and one on top of the forearm, with potentially other areas as well, like the upper arm, chest, or legs. Gadgets can have a small, medium, or large size that determines how many zones they occupy. This lets the user place gadgets on their person while keeping them aligned, and streamlines many gadgets occupying the same space.

A huge feature would be the ability for users to swipe/scroll through the widgets in one of these grids, letting users keep far more widgets than would otherwise fit in an organizational framework like this and making leaner, specific-use-case UI widgets more viable. It would also make it easy for users to shrink this gadget's UI if the game being played uses the wrist, forearm, etc. for its own UI.
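A sketch of the zone-assignment logic for a single forearm grid, with all dimensions and size footprints illustrative: first-fit placement of small/medium/large gadgets:

```typescript
const COLS = 3, ROWS = 6; // one forearm grid, per the example above
const footprint = { small: [1, 1], medium: [2, 1], large: [3, 2] } as const;

const occupied = Array.from({ length: ROWS }, () => Array(COLS).fill(false));

// First-fit: scan rows top to bottom for a free rectangle of the right size.
function place(size: keyof typeof footprint): { col: number; row: number } | null {
  const [w, h] = footprint[size];
  for (let row = 0; row + h <= ROWS; row++) {
    for (let col = 0; col + w <= COLS; col++) {
      const free = occupied
        .slice(row, row + h)
        .every((r) => r.slice(col, col + w).every((cell) => !cell));
      if (!free) continue;
      for (let r = row; r < row + h; r++)
        for (let c = col; c < col + w; c++) occupied[r][c] = true;
      return { col, row }; // gadget anchors at this zone
    }
  }
  return null; // grid full -- the user scrolls to another page of widgets
}
```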


Who would use this gadget?

Gadget developers would worry far less about where to place their widgets, since placement can be offloaded to this gadget. Users would have an easier time managing their gadgets and using simple gadgets that aren't spatial or have very limited functionality (alarm clocks, play/pause music controls, etc.).

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or UI design, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The hardest part of making this gadget is likely creating a framework that is easy for gadgets to operate on top of.

Explicable Language Interpreter (Eli)

What would this gadget do?

This gadget would attempt to create subtitles for what the user is saying, feed them into an online translation API, then display the result in front of them so other individuals' gadgets can see them.

Displaying the text in front of the user from their own gadget, rather than sending it straight to another user's gadget, may be better for privacy, but it would only be visible from one angle. Alternatively, the app could have an approval function, and when you speak, your words would appear in others' view like closed captioning in scripted VR games today. This method would also allow translation to happen on the receiver's gadget, letting a group speaking and listening in more than two languages communicate.

Who would use this gadget?

Digital translation, especially when it also depends on speech-to-text, is not terribly accurate, so this couldn't really be used for serious purposes; but as a last resort, or in multiplayer games, it could be useful to most people.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The app would need to connect to general translation APIs like Google Translate, which is a common thing already. Ideally it would also alert the user if they're speaking too quickly, perhaps by showing them the English input as it is turned into text, and let others click a "don't understand" indicator for when things don't work as intended.
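A sketch of that pipeline, assuming Aardvark's Chromium runtime exposes `webkitSpeechRecognition` (the Web Speech API) and using a placeholder `/translate` endpoint rather than Google's real API; `showCaption` is a hypothetical UI call:

```typescript
declare function showCaption(text: string): void; // hypothetical caption renderer

const rec = new (window as any).webkitSpeechRecognition();
rec.continuous = true;
rec.interimResults = true; // show the English text live so speakers can pace themselves
rec.lang = 'en-US';

rec.onresult = async (ev: any) => {
  const result = ev.results[ev.results.length - 1];
  if (!result.isFinal) {
    showCaption(result[0].transcript); // interim English echo
    return;
  }
  // Placeholder endpoint; a real integration would call a translation service.
  const res = await fetch('/translate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: result[0].transcript, target: 'ja' }),
  }).then((r) => r.json());
  showCaption(res.translatedText);
};
rec.start();
```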

Virtual Computer with desktop view, keyboard, and mouse (LGR)

What would this gadget do?

This gadget involves three components: a desktop view that supports multiple monitors, with control over size, position, curvature, and opacity, as well as task switching and swiping like a touch screen; a toggleable keyboard with the same size/position/curvature/opacity controls, matching a full Windows keyboard and adding VR-essential functions like "alt-tab" as large buttons; and a mouse that locks itself to a flat plane when in use to allow full functionality, with adjustable sensitivity. Each component can run independently with the others disabled or hidden.

If the monitor could be moved with the headset's system button combined with gaze movement, that would be ideal, so the user always has a way to tweak their setup.

Desktop+ has a feature-complete keyboard, represented as a 2D object. This gadget has the opportunity to move away from a locked rectangular shape in order to fan out the function keys, enlarge the spacebar, etc.

One example of such a mouse implementation is in Vacation Simulator.

Who would use this gadget?

This gadget provides three components for other gadgets to make use of. Gadgets like #25 or #2 can provide the keyboard and mouse to go with this gadget's monitor. This gadget's keyboard could serve as the best keyboard input system for other widgets when sending messages or configuring them, and the mouse could be used with the default SteamVR desktop view or overlays, as well as for games or drawing gadgets. This is a model for gadgets keeping their features compartmentalized, letting users mix and match solutions and letting other gadgets borrow functionality.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest part of building this gadget will likely be making the desktop view a high-performance and bug-free experience.

HOTAS Controls with GUI (CAKE)

What would this gadget do?

Using the two implementations below as conceptual models, this gadget creates a 3D throttle and stick within Aardvark that can be used as HOTAS inputs. Users would select a profile or a rigged model they want to use, then place the controls around their seated position and anchor them. This allows games with HMD support but no VR control scheme to be played with VR controllers. Users can save control schemes, both the assigned inputs and the placement of the 3D UI. Adding only macros is a separate implementation, shown in a separate proposal I listed before.

https://store.steampowered.com/app/1355840/animARide/

https://github.com/dantman/elite-vr-cockpit

Who would use this gadget?

VR as a display method for simulators and cockpit games is common, but in-VR controls are not. And while there are many ways to run non-VR games in VR, it's rare to the point of uniqueness to be able to control those games spatially. Many users would be excited both to develop implementations for this and to play their favorite games with 3D-rendered VR controls. It would also need a small macropad for controls that cannot fit onto the stick and throttle.

This also creates a model for GUI elements as a way of adding controls, macros, or other elements to non VR games. By mapping keyboard controls, gamepad inputs, accessory inputs, or macros to 3D UI elements or controls, many games could be significantly improved in VR including HMD only games, non VR games running in VorpX, emulated games in DolphinVR, or non VR games simply being played on a VR screen. Making saving and sharing of control profiles simple is essential to making this accessible for a lot of use cases.

In the long term, clever ways to fake direct interactions with in-game tools could increase the polish of these experiences. Eventually it could also augment workflows by spatializing complex actions, though that's outside the immediate scope.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest part of building this gadget is likely making a 3D interface that can be controlled without being anchored in real space like a real HOTAS (although accessories to do this exist) and can be used quickly and reliably in games.

A framework to add intelligent occlusion and context awareness to Aardvark (Aperture)

What would this gadget do?

This gadget would allow Aardvark gadgets to have occlusion, and basic contextual awareness, in a roomscale or AR scene. This would be done in a "simple" or "advanced" mode based on the user's needs.

In simple mode, the user places occlusion boxes, drawing them in VR as vector shapes (spheres, rectangles, triangles) and positioning them over objects in the roomscale environment. Multiple shapes that clip through each other can cover a single object for higher fidelity. Other Aardvark apps would then use these boxes to occlude anything rendering behind the occluded object. This is a roomscale function, meant for testing or for when the user stops for a short period and wants to add occlusion to an object in a program. It would also define the floor and prevent gadgets from rendering beneath it, perhaps even being relied on to prevent gadgets from moving below the floor at all, and the same with the ceiling if necessary. A laser with an interaction point, extended and retracted with the joystick, would allow the user to create a zone from a distance or mark walls outside the roomscale boundaries. These boxes would be at half opacity until finished, then activated to take effect and become invisible, occluding Aardvark widgets and passing through the VR layer.

The advanced mode would allow the user to not just place shaped occlusion boxes but to recreate the scene they are in within Aardvark, "tracing" the environment by placing walls, a floor, and a ceiling, as well as objects such as windows, picture frames, chairs, and more complex shapes or free-drawn occlusion zones. This would allow for more advanced occlusion.

Advanced mode would also allow occlusion boxes to be marked with a context like TV, window, door, couch, or chair. With this, it can feed gadgets information they could use to provide advanced AR functionality: skyboxes out a window, things walking in through the door, movies playing on televisions, pictures in the frames, awareness of where the chair in the room is, and so on. Developers would be able to add new contexts in their gadgets, letting users label an occlusion zone with that context and expanding the scope of this use case. Other gadgets can also use these occlusion zones as collision for objects.

This gadget thus gives other gadgets a foundation for handling occlusion and environmental awareness, essential components of AR, without machine vision.

Both of these modes assume a way to reliably center the player's "tracing" in an environment, so there should be an easy way to create a marker that anchors and aligns a set of occlusion zones.

This gadget would also work over 3D passthrough in order to add occlusion and context to the user's room, which is already obviously roomscale. Occlusion zones could also serve a related function for another gadget: defining a spatial zone of 3D camera passthrough in order to bring a real-world object into VR, or the reverse, adding a virtual object into the AR passthrough. This would integrate with "Chells," a gadget proposed for that purpose.

The gadget could communicate with someone else's gadget in order to render an approximation of the scene/room the primary user is in. Occlusion zones could serve as a base for a game to let a player "trace in" a table or other surface to use in the game. Pre-rendered photogrammetry to create fuller telepresence is outside the scope of this gadget.

After some testing in both AR and VR, Aardvark gadgets appear to always render over VR programs. When it comes to the user's in-game hands, this is a major problem and can make interacting with gadgets harder and even headache-inducing. Occlusion zones built around player hands may be an important part of using Aardvark comfortably. This may require custom profiles for games, covering the range of motion and orientation of the player's hands in that software, as well as a generic profile.
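A sketch of the data contract other gadgets might consume, with hypothetical names: tagged, room-anchored boxes plus a containment test gadgets could use for occlusion or collision:

```typescript
type Context = 'wall' | 'floor' | 'window' | 'door' | 'tv' | 'chair' | string;

interface OcclusionBox {
  center: [number, number, number]; // room-anchored position, meters
  size: [number, number, number];   // full extents along each axis
  context?: Context;                // advanced mode only; extensible by gadgets
}

// Axis-aligned containment test; a consuming gadget could hide or clip a
// node whose position falls inside (or behind) a zone.
function contains(box: OcclusionBox, p: [number, number, number]): boolean {
  return p.every((v, i) => Math.abs(v - box.center[i]) <= box.size[i] / 2);
}
```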

Who would use this gadget?

This gadget adds advanced AR functionality that is usually the product of machine vision to Aardvark. This eliminates a great deal of the gap Aardvark has from lacking access to machine vision in either VR or AR. Developers intending to design for AR principles and the most likely AR technology, who expect to be able to design around the machine-vision aspects of AR, would be able to use occlusion and context awareness in their designs and prototypes, as they would expect. Especially when used with 3D passthrough, this makes Aardvark an excellent choice and puts PCVR well beyond most competition.

It also allows consumers to make use of these same tools for AR games or utilities, or in fully roomscale VR games. They could use it in social functions or games with others, though for other users it would be a standard VR object.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Creating the foundation for this gadget to feed other gadgets this information in a manner they can understand in order to be occluded properly, or gain awareness of context, seems to be the most difficult aspect.

Aardvark Evening Filter Sunglasses ("Gordons")

What would this gadget do?

This gadget creates a pair of sunglasses. By putting them on, the glasses overlay a slightly opaque reddish brown filter onto the user’s entire view, similar to a blue light filtering program like F.lux.

This is a test case for more advanced functionality like applying image filters, and the concept of AR glasses inside of VR as a way of enabling AR layers like passthrough windows, chat UI, or a debug menu function where gadgets report errors over themselves. Glasses like this could be an ideal way to manage different layers of UI and be indicative of the usefulness of physical objects as a form of UI for Aardvark. The player can remove them by placing their hand next to their head and holding their grip for several seconds with a little haptic response conveying that they’re holding the glasses.

Future revisions could apply different or more complex filters, with machine-vision filters being the highest level of functionality. However, the gadget could integrate with others, like an avatar gadget, to allow "fake" machine vision, applying an effect to another user's Aardvark avatar instead of to an in-game avatar or passthrough view.

Who would use this gadget?

Average users would make use of this gadget as a simple and intuitive way to enable a “blue light filter” effect. In the future it could allow for “filters” on other aardvark gadgets, reveal UI layers, or allow developers to have an intuitive way to enable a visual debug mode.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Cleanly applying the filter at the right level of opacity without any artifacting may require a lot of trial and error. Building beyond this without access to the machine vision that drives Snapchat-style filters would be much harder, if possible at all.

Build a bridge between Mozilla Hubs and Aardvark

What would this gadget do?

It would allow Aardvark users who are in a hub together to see and interact with each other's gadgets. Aardvark already supports this. What is missing is the code that figures out who a user is in a hub with and shares that with Aardvark.

Who would use this gadget?

Anybody using both Hubs and Aardvark. It would also make it more likely that users of one would start using the other.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I may not have time during the hackathon to work on this because I'll be helping all the teams.

Mainly this project needs programmers: JavaScript-savvy folks to figure out how to tease this information out of Hubs, and server-side folks to bring up a quick-and-dirty service that shares the gadget information.

What will be the toughest part of building this gadget?

Getting Hubs to share this information. That could be accomplished with actual Hubs client code changes (since it's open source), or by wrapping Hubs in an IFrame and reaching directly into the JavaScript state of the Hubs IFrame to pull out what we need.

Passthrough at launch with onboarding GUI (Pocket Universe)

What would this gadget do?

This gadget would launch with SteamVR and enable 3D passthrough. It would also render a large spigot at the room center. Users can spin the circular lever on it to turn off passthrough and optionally open the SteamVR dashboard. They can also press a button on top of it to leave the headset in passthrough mode. Both methods make the spigot disappear; it also disappears if the system button is pressed.

The gadget could also allow the passthrough to come on whenever players close a game in SteamVR.


Who would use this gadget?

This gadget serves as a simple form of onboarding for newer users, letting users pick up their controllers or clear their playspace while in headset. It removes a layer of friction from the experience of starting up VR.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The most difficult part of this would be making and animating the model and scripting it to start and stop passthrough.

VR Locomotion Accessibility Gadget

What would this gadget do?


This gadget would have several different options:

  • Reduce FOV (basically putting on blinders; see the sketch below for one possible strength curve)
  • VR nose (research has shown that having a "nose" in VR can help)
  • Cockpit/world-locked visual grounding (reducing how much of the world moves at any moment can help)
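For the FOV reduction, one possible strength curve, assuming the gadget can obtain the virtual (not physical) locomotion speed from somewhere, which is itself an open question:

```typescript
// Comfort-vignette strength in [0, 1] as a function of virtual speed (m/s).
// maxSpeed and the easing curve are illustrative tuning values.
function vignetteStrength(virtualSpeed: number, maxSpeed = 4): number {
  const t = Math.min(virtualSpeed / maxSpeed, 1);
  return t * t; // ease in so slow movement stays unobstructed
}
```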

Who would use this gadget?

Anyone like me who loves VR but still is incredibly sensitive to sim-sickness due to the vestibular mismatch that happens when using certain kinds of locomotion in games.

Currently, these accessibility options are up to each individual VR developer to make and support. This means there are some games that don't have a full range of options, leaving some users/players out of luck when it comes to having a comfortable experience.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Basic Programming and interaction design

What will be the toughest part of building this gadget?

Providing easy access to the configuration tools while using other experiences.

Questions:

  • Could we make it easy to map certain actions to controller input? If so, we could limit the FOV while pressing forward on the joystick in a game that uses smooth locomotion
  • Can we have Aardvark detect what scene app is running so settings can be configured to specific applications and games?

Avatar Creation Tool (Personality Constructs)

What would this gadget do?

This gadget allows users to make their own avatars. The construction system offers the user basic 3D shapes (rods, cones, spheres) that they can use to build an avatar's head and torso. Shapes clip through each other, letting users combine them into complex avatars if they want. Shapes have three characteristics: size, color, and the shape itself. The gadget only produces an output for other gadgets to consume. Hands are represented by the user's controllers, or by a separate hand gadget.

This is based on SteamVR’s system: https://steamuserimages-a.akamaihd.net/ugc/942832810028470735/0B275ED133BBAC9CF88680501968370D57AC0A57/

There should be something in this gadget to let others' gadgets block or hide the avatar of another user, for obvious reasons. For performance, being able to rasterize an avatar into one object, without completely losing the information needed to edit it, would be ideal.

Who would use this gadget?

This gadget would eliminate a step in making any social application, without relying solely on whatever social program Aardvark might be integrated with (Pluto, etc.).

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

I’m not sure what the hardest aspect of this is, but I would imagine handling lighting on the 3D shapes and making sure the avatar is accessible to other gadgets.

AardBoy

What would this gadget do?

This gadget creates a small model of a handheld game console. The screen would mirror the user's desktop, and input would be supplied by translating VR controller input into XInput or DirectInput. Control is established when the user closes both hands around the console, and broken when the user pulls their hands more than 90 degrees apart. Users can scale the console bigger or smaller to make the display readable.

This is inspired by the Game Boy functionality of New Retro Arcade.

Who would use this gadget?

This gadget would be used to play PC or emulated games inside VR on top of other applications.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Creating a remappable control translation system between SteamVR input and DirectInput/XInput would be difficult.

Static/Dynamic furniture/object tracker

What would this gadget do?

This gadget would create virtual representations of real furniture in your room.

The furniture that this gadget tracks would be split into two categories:
  • Static objects, which aren't moved around regularly (e.g. couches, desks, shelves)
  • Dynamic objects, which are moved around regularly (e.g. office chair, wireless keyboard, maybe a door?). Currently, dynamic objects would be tracked by Vive Trackers

There should be a button to toggle display of the objects.

For extra milestones, it'd be nice if static/dynamic objects would "fade in" when the user is close to the edge of their play space. If a dynamic object is in the playspace, it should always be visible.

Who would use this gadget?

People who'd want to switch from standing to sitting or the reverse in VR without taking off the headset/looking through the nose-hole.
People who'd want to type something on their wireless keyboard (or wired keyboard in a generally fixed position) without taking off the headset/looking through the nose-hole
People who'd greedily want more playspace and feel confident that there's more playspace beyond the boundary (e.g. over a couch)
People who'd want less reasons to take off their headset :)

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

This project needs 3d assets, but there should be easy-to-find sample models of couches/desks/chairs.

What will be the toughest part of building this gadget?

The toughest part of this gadget would be representing dynamic objects appropriately depending on where the Vive Tracker is mounted. For example, the Vive tracker could be mounted to the back/left arm/right arm of an office chair.
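The usual answer is a one-time calibration: capture the offset between the tracker and a manually aligned chair model once, then reapply it every frame regardless of where the tracker is mounted. A sketch with gl-matrix:

```typescript
import { mat4 } from 'gl-matrix';

// Calibration: with the virtual chair manually aligned over the real one,
// offset = inverse(trackerPose) * alignedChairPose.
function calibrate(trackerPose: mat4, alignedChairPose: mat4): mat4 {
  const inv = mat4.invert(mat4.create(), trackerPose)!;
  return mat4.multiply(mat4.create(), inv, alignedChairPose);
}

// Every frame afterwards: chairPose = trackerPose * offset, which holds
// whether the tracker sits on the back, left arm, or right arm of the chair.
function chairPose(trackerPose: mat4, offset: mat4): mat4 {
  return mat4.multiply(mat4.create(), trackerPose, offset);
}
```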

Show twitch chat (etc) as speech bubbles or similar

What would this gadget do?

Pipe twitch chat into Aardvark as something popping up in front of the VR player while streaming.

Who would use this gadget?

VR streamers on Twitch (and perhaps other platforms if they have accessible APIs)

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I can do most coding, so I guess 3D modelling? I can make a box in Blender though, uhm.

What will be the toughest part of building this gadget?

Understanding React and how to actually use Aardvark, I think. It might also be the chat API; I haven't had time to check whether there's a good package for it yet.

Twitch Chat 3D Audience

What would this gadget do?

Replace 2D Twitch chat windows that many streamers pin in their VR overlays with a miniature 3D bleacher that shows Twitch chatters as simple 3D avatars sitting in the bleacher. What they say can appear in comic chat bubbles above their little avatar. As users join & leave the chat, it is physically represented on the bleacher. Viewers could even use special chat bot commands to alter their little 3D avatar's color or style.

The goal is to replace the 2D visual representation of a streamer's Twitch chat with a 3D physical representation.

Who would use this gadget?

Small audience Twitch streamers who would normally have their boring 2D chat window pinned to their VR overlays.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

This involves writing a JavaScript Twitch chat client to access the chat & join/exit messages, making 3D models of the bleacher & little 3D avatars for chatters, and creating a method that takes what a Twitch chatter says and writes it to a canvas to position over their avatar's head. I can do all of these things over the course of a couple days, but will gladly split up the work if anybody wishes to assist.
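For the chat client, Twitch's IRC-over-WebSocket gateway is enough without any package; a sketch (the membership capability is what surfaces the join/leave events that move avatars onto and off the bleacher):

```typescript
function connectChat(username: string, token: string, channel: string,
                     onChat: (user: string, text: string) => void) {
  const ws = new WebSocket('wss://irc-ws.chat.twitch.tv:443');
  ws.onopen = () => {
    ws.send('CAP REQ :twitch.tv/membership'); // enables JOIN/PART events
    ws.send(`PASS oauth:${token}`);
    ws.send(`NICK ${username}`);
    ws.send(`JOIN #${channel}`);
  };
  ws.onmessage = (ev) => {
    const line = ev.data as string;
    if (line.startsWith('PING')) return ws.send('PONG :tmi.twitch.tv');
    // :<user>!<user>@<user>.tmi.twitch.tv PRIVMSG #<channel> :<message>
    const m = line.match(/^:(\w+)!.* PRIVMSG #\w+ :(.*)/);
    if (m) onChat(m[1], m[2]); // drive the avatar's speech bubble here
  };
}
```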

What will be the toughest part of building this gadget?

First, for a JavaScript Twitch chat client to work, the user must type their Twitch username and paste their OAuth token into the gadget somehow. I am unsure whether this will actually be a challenge in Aardvark, but it very well could be, depending on whether Aardvark has a way for users to type or paste text into gadgets yet.

Second, making it scale elegantly between drastically different community sizes. Large streamers have thousands of viewers and their chat moves insanely fast, so it would not be too useful for them.

To avoid the need to scale, we could simply target smaller Twitch communities that only have < 20 viewers/chatters. This small size actually accounts for the majority of Twitch streamers, including every single one of my friends who Twitch streams. :D

GadgetBox

What would this gadget do?

This gadget acts as a place to store 3D gadgets. Users can reach towards their lower back and pull out a box where they can take out or place gadgets. Users can take gadgets that are out, hold their hand over their lower back, and release in order to store the gadget without taking out the box. It also would allow gadgets to remain on while stored.

Who would use this gadget?

Users would use this to store their 3D gadgets and keep them manageable, and allow gadgets to be used passively. For example a user can start a playlist on a media player gadget, then store it in the box to continue playing music while they use VR.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

Controller Assistance (Cross)

What would this gadget do?

This gadget allows an external user to highlight buttons or actions on a controller, in order to explain controls to someone in a headset. This would be done from the desktop or from a phone. Software like Parsec would allow someone to do this over the internet, and not require dedicated Aardvark support.

The app would need controller models for most controllers, and either animations for the buttons or just the ability to highlight them and display text saying “press,” “hold,” or “slide.” It needs to clearly show buttons on the back as well through a flashing note coming off the controller. Starting with Index, Vive, and WMR.

A more elaborate version would allow someone externally to mark 3D space relative to the in headset user in order to show an action or point out a game element (where ammo is stored for example, or where a brush tool is in creative software).

Who would use this gadget?

People who are onboarding others into VR at home, at presentations, or showing VR or games to friends. It erodes the separation in VR between someone in-headset and someone trying to help them. Many games simply don't have tutorials, or are being played in a way where each player going through the tutorial isn't ideal.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding or 3D modelling, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

This seems like it has multiple distinct aspects that all need to be done separately. Making animations, making a desktop UI, making a phone browser based solution especially, and adding and preparing the controller models. Adding a spatial mode to this would be much harder.

Stop Watch

What would this gadget do?

image

  • Starts a timer when interacted with
  • Can add "laps"
  • Stops the timer when pressed again (see the sketch below)
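
The core logic is small; a minimal sketch (class and method names are hypothetical):

```typescript
// Minimal stopwatch state machine backing the three interactions above.
class Stopwatch {
  private startMs: number | null = null; // null means not running
  private laps: number[] = [];

  start(): void {
    this.startMs = performance.now();
    this.laps = [];
  }

  lap(): number {
    if (this.startMs === null) throw new Error('stopwatch is not running');
    const elapsed = performance.now() - this.startMs;
    this.laps.push(elapsed);
    return elapsed;
  }

  stop(): number {
    if (this.startMs === null) throw new Error('stopwatch is not running');
    const elapsed = performance.now() - this.startMs;
    this.startMs = null;
    return elapsed;
  }
}
```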

Who would use this gadget?

  • Anyone practicing or learning something in VR. It could be an artist tracking how long it took them to do a certain painting, or a speedrunner practicing a certain part of a level. It could also be used for prototyping social VR party games, as timers are a basic building block of many games

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Basic programming skills, plus someone with UX design experience, as ease of use will be a critical aspect of the gadget.

What will be the toughest part of building this gadget?

Making it easy to see and interact with while not getting in the way of whatever is being timed.

Sound Design VR Toolkit

What would this gadget do?

One of the bigger problems for sound designers working in VR is that they mix and work with their sounds in a tool (called a digital audio workstation, or DAW) that is not natively in VR. This toolkit would provide an extension for sound designers to listen to and edit sounds spatialized in VR in real time. The first tool I would try to build is simply previewing a sound spatialized in VR: bringing up the widget, connecting to another running app (the DAW), and being able to play.

Who would use this gadget?

Sound designers in VR

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I'm a sound designer that has built a custom audio manager in C#. I will probably need an actual programmer to set up the aardvark portion, but I'll muddle through the best I can with what I got!

What will be the toughest part of building this gadget?

Anything involving Aardvark interfacing with the DAW in question. I'm uncertain whether REAPER can actually take commands from another app, but it certainly can do a lot, so I'm hopeful.
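
For what it's worth, REAPER does expose an OSC control surface (Preferences > Control/OSC/web), so one plausible route is sending OSC messages from the gadget's Node side. A sketch using the osc npm package; the ports are whatever you configure in REAPER, and the action ID (1007 should be "Transport: Play") is worth double-checking against REAPER's action list:

```typescript
// Send a transport command to REAPER over OSC (npm install osc).
// Assumes REAPER's OSC control surface is enabled and listening on port 8000.
// If no type definitions are available, a `declare module 'osc'` shim may be needed.
import osc from 'osc';

const port = new osc.UDPPort({
  localAddress: '0.0.0.0',
  localPort: 9000,            // where REAPER can send feedback, if configured
  remoteAddress: '127.0.0.1',
  remotePort: 8000,           // REAPER's configured listen port
});

port.on('ready', () => {
  // Trigger REAPER action 1007 ("Transport: Play") -- verify the ID
  // against your REAPER version's action list.
  port.send({ address: '/action/1007' });
});

port.open();
```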

A Shopify plugin that lets you shop spatially with another person

What would this gadget do?

Shop from the Orcas Island Co-Op grocery in VR (https://orcas-food-coop.myshopify.com/).

Who would use this gadget?

I would use this gadget, as I find shopping for groceries online loathsome, and the local Co-Op is only doing online order/pickup. I have six other people who would use it.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Web programming. Doing shopify extensions would be neat but probably not necessary

What will be the toughest part of building this gadget?

The great unknown. And that I'll probably be doing it myself and I don't really know anything (relevant). :-)

Test Case for a Social Game in Aardvark (Mossman)

What would this gadget do?

This is a test for a social game in Aardvark. Each player gets a card explaining the location they are in (Aperture Lab, Black Mesa before the resonance cascade or after, Mann Co, Xen, Ravenholm) except for one player, whose card has a capital “T” in white over a red background. Players ask each other questions about the location they are in, trying to ascertain who is the traitor. Each player has a list of all possible locations. Once they decide, they can vote, and if correct they win; but at any time the traitor can reveal themselves and attempt to guess the location to win. Players get points for winning as the traitor or for guessing who the traitor is by initiating a vote, but it’s not a competitive game.

Dealing out the cards and keeping a tally of points are scripted, but all other functions can be unscripted.

Who would use this gadget?

This gadget is a party game for small groups. Social games work really well in VR and this one requires little to no movement, making it more viable in roomscale, AR, and inside social VR programs.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I actually could make the artwork for the cards, but I am not a programmer or experienced with 3D modelling.

What will be the toughest part of building this gadget?

The toughest part of building this gadget will be coordinating all players’ gadgets to communicate properly. Since all players have severely limited information, a glitch that fails to assign a traitor would go unnoticed.

Rear Warning system

What would this gadget do?

This gadget alerts the user when they are about to back into an object in their room that they marked with another appropriate gadget. The notification would be a small, round, pre-captured image of the object.

Who would use this gadget?

Users would use this gadget to supplement existing boundary systems and allow for sharing space with static objects in XR.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest part of making this gadget is making it activate only when the user is facing away from the marked object, and doing so reliably enough to actually increase safety.
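
The facing test itself is simple if the gadget can read the HMD pose: project the direction to the marked object onto the headset's forward vector. A minimal sketch, with hypothetical names, assuming both poses are in the same coordinate space:

```typescript
// Warn only when the marked object is behind the user and close enough.
type Vec3 = { x: number; y: number; z: number };

function shouldWarn(
  hmdPos: Vec3,
  hmdForward: Vec3,
  objectPos: Vec3,
  triggerRadius: number
): boolean {
  const toObject = {
    x: objectPos.x - hmdPos.x,
    y: objectPos.y - hmdPos.y,
    z: objectPos.z - hmdPos.z,
  };
  const dist = Math.hypot(toObject.x, toObject.y, toObject.z);
  if (dist === 0 || dist > triggerRadius) return false;
  const dot =
    hmdForward.x * toObject.x + hmdForward.y * toObject.y + hmdForward.z * toObject.z;
  return dot < 0; // negative dot product: the object is behind the user
}
```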

Measuring Tape/Volume

What would this gadget do?

image

Measures stuff! (see the sketch below)

  • Metric
  • Imperial
  • Length
  • Volume
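
The core measurement is just the distance between the two hand positions, converted to whichever unit system is selected; a minimal sketch (function names are hypothetical):

```typescript
// Distance between the two grab points, reported in both unit systems.
type Vec3 = { x: number; y: number; z: number };

function measureLength(a: Vec3, b: Vec3): { meters: number; feet: number } {
  const meters = Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z);
  return { meters, feet: meters * 3.28084 };
}

// Volume of an axis-aligned box defined by two opposite corners.
function measureVolume(a: Vec3, b: Vec3): { cubicMeters: number } {
  const cubicMeters =
    Math.abs(b.x - a.x) * Math.abs(b.y - a.y) * Math.abs(b.z - a.z);
  return { cubicMeters };
}
```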

Who would use this gadget?

  • Anyone curious how long or big something is
  • VR artists checking in on the sizes of the things they are creating
  • Someone checking out a VR game and seeing if the in-game assets are similar to real world objects
  • With camera passthrough on Vive or Index, could be used for real world objects in your room

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

Basic programming and animation skills

What will be the toughest part of building this gadget?

Getting the maneuvering and usage of the tool to feel really good will be hard.

Input Output Organization (OGAS)

What would this gadget do?

This gadget collects “outputs” from gadgets and connects them to other gadgets as “inputs.” Functioning like a switchboard, it simplifies communication between larger numbers of gadgets and lets users see a list of all communication and information exchange between gadgets.

A simple form of this chain is a desktop mirror widget being fed to gadgets that require a desktop view, like a media center gadget or office software; or socially focused gadgets calling for whatever avatar gadget the user has installed. A more advanced example would be a generic UI command calling this gadget to call another gadget to take a screenshot, which is then fed into a fourth gadget that spawns the screenshot to share with another user.
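
A hypothetical sketch of the switchboard's data model (all names here are invented): gadgets register named outputs and inputs, and connections wire one to the other:

```typescript
// Hypothetical data model for the switchboard: endpoints and connections.
interface Endpoint {
  gadgetId: string;  // which gadget owns this endpoint
  name: string;      // e.g. "desktopMirror", "screenshot", "rssArticles"
  dataType: string;  // e.g. "texture", "image/png", "url[]"
}

interface Connection {
  from: Endpoint;    // an output
  to: Endpoint;      // an input; dataType should match `from`
}

// The switchboard itself is then mostly a list of connections that it
// can display, add to, and remove from.
interface Switchboard {
  outputs: Endpoint[];
  inputs: Endpoint[];
  connections: Connection[];
}
```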

Who would use this gadget?

Gadget developers can use this to make their gadgets as compatible as possible with other gadgets without anticipating their functions, and to troubleshoot whether gadget-to-gadget communication is working. Users can use this gadget to connect gadgets manually, such as having an RSS feed gadget send its output of articles to a gadget that spatializes photos, web links, etc.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The toughest part of making this gadget would be making the system of inputs and outputs easy for other gadgets to hook into.

Gadget Configuration Manager

What would this gadget do?

This gadget creates a single UI layer that all gadgets can use to offload their own settings and configuration into. Each new gadget will see its settings UI placed within GCM, listed below the one before it in the left-hand menu, along with any submenus. Selecting a gadget will open its settings on the right side.

GCM would have UI features like a left-hand list of gadgets along with any needed submenus and a search option, sliders, incremental adjustment, toggle adjustment, drop-down menus, tooltips, and checkboxes. All settings are saved in a common config that all gadgets have access to. The gadget also has a desktop UI as a fail-safe and an easy way of configuring multiple gadgets at once.
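
One plausible shape for the common config: each gadget publishes a declarative description of its settings, and GCM renders the matching controls and persists the values. A hypothetical sketch:

```typescript
// Hypothetical declarative settings schema a gadget could hand to GCM.
type Control =
  | { kind: 'slider'; min: number; max: number; step: number }
  | { kind: 'toggle' }
  | { kind: 'checkbox' }
  | { kind: 'dropdown'; options: string[] };

interface SettingDescriptor {
  key: string;            // stable key the gadget reads values back with
  label: string;          // shown next to the control
  tooltip?: string;
  control: Control;
  defaultValue: number | boolean | string;
}

interface GadgetConfigSection {
  gadgetId: string;       // unique per gadget
  displayName: string;    // shown in the left-hand list
  settings: SettingDescriptor[];
}
```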

This is inspired by the Mod Configuration Menu from the Fallout community: https://slack-files.com/TPD5V23JT-F01BDB9414L-ba4f532d84

Who would use this gadget?

Gadget developers can use this gadget to offload the work of making a settings UI. For users, it also makes finding the settings/configuration for any gadget easy. This is also useful because misconfigured settings can make VR applications glitch out, fly away, or otherwise become unusable; this UI gives users a single place to troubleshoot.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

The hardest part of this gadget is making a fully featured UI, with all the necessary pieces like tooltips and sliders, that is also highly usable.

Game Text Layer Integration

What would this gadget do?

This gadget would display text or simple graphics (speed dials, etc.) at a location and orientation fed to it by another, non-Aardvark application.
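
A hypothetical sketch of the per-item message a game might stream to the gadget (all field names here are invented):

```typescript
// Hypothetical message a game sends for each text/dial item it wants
// the overlay to render at high resolution.
interface TextLayerItem {
  id: string;                                  // stable id for updates/removal
  text: string;                                // or a reference to a prebuilt dial/graphic
  position: [number, number, number];          // meters, in tracking space
  rotation: [number, number, number, number];  // orientation as a quaternion
  heightMeters: number;                        // world-space text height
}
```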

Who would use this gadget?

Game developers would add integration for this gadget, allowing them to offload rendering of fine details like text or dials. Having the text on a separate layer would allow games to add text, dials, small UI elements, etc. at a much higher resolution than the main layer of the game. This lets demanding software, deferred-rendering engines, or anything else that struggles to run at high resolution with heavy supersampling include fine details while staying performant. High-resolution headsets won't necessarily be able to make full use of their resolution in all software, but this would mitigate that significantly.

It also opens the door to game/software integration with Aardvark and multi-app workflows, and serves as a good example case to developers for separating out functions to make them more efficient and/or future-proofed. Simulators are a notable example where performance is always strained but high resolution is seen as essential to immersively use the cockpit or driver's-seat UI. VR games tend to avoid text because of readability, but this could allow any headset to use the highest resolution and supersampling possible, making smaller text in VR viable.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I am not experienced with coding, I’m just trying to provide the concept for consideration.

What will be the toughest part of building this gadget?

Making this gadget high fidelity enough without stuttering, harming performance, or standing out from the main layer too much would be a difficult task, and it would require a straightforward way for developers to seamlessly add it to the hundreds of uses of text in their games. Developers need a high level of confidence in Aardvark to integrate it with their products.

Cross-app fitness tracker

Wanna see how many calories you've burned since booting up Beat Saber? How about the number of squats and spins in Pistol Whip? Aardvark offers a great way to track and visualize raw motion data so users can see the health impact no matter what app they're running.

What would this gadget do?

Track motion controller and HMD data and visualize it to the user over time. It could potentially track app use as well, to offer insights comparing health impact between experiences.

Who would use this gadget?

People interested in health and fitness in VR. Fans of experiences like Supernatural and Beat Saber who want a unified dashboard to track their health across myriad apps.

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

My strength is UX design, but I have a solid background in 3D modeling. My coding skills are pretty much limited to Blueprint visual scripting, so I could use a lot of help with development. I also know very little about actual health calculations (such as translating a punch into calories burned), so that would be good to have as well.

What will be the toughest part of building this gadget?

Translating raw input data into useful health insights will be the biggest challenge.
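
As a starting point before any real calorie model, the gadget could simply accumulate how far each tracked device travels per frame. A rough sketch (not a validated health metric; names are hypothetical):

```typescript
// Accumulate per-device travel distance as a crude activity metric.
type Vec3 = { x: number; y: number; z: number };

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z);
}

class ActivityTracker {
  private last = new Map<string, Vec3>();     // device id -> last position
  private meters = new Map<string, number>(); // device id -> total meters moved

  update(deviceId: string, pos: Vec3): void {
    const prev = this.last.get(deviceId);
    if (prev) {
      this.meters.set(deviceId, (this.meters.get(deviceId) ?? 0) + distance(prev, pos));
    }
    this.last.set(deviceId, pos);
  }

  totalMeters(deviceId: string): number {
    return this.meters.get(deviceId) ?? 0;
  }
}
```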

A basic card game to show persistence and interactivity between users' gadgets (Crab)

What would this gadget do?

This gadget would work as a test case to show how users’ gadgets can communicate with each other, how users can interact with each other’s gadgets, and how persistent elements can be maintained in an interaction.

Mirroring Skull, Crabs is a simple bluffing game using only a handful of cards. Players, starting with 4 cards each, go around placing down a card without revealing what it is, stacking them on top of each other. Players can then either place another card or make bets (auction style) about how many cards, starting with their own stack then drawing from others’, they can pick up without revealing a headcrab card. If right, they get a point; two points to win. If wrong, they randomly lose one card and play goes around again.

Bets, points and other aspects of the game can be built into the gadget, to show off UI based on shared information or even scripting, but doesn’t have to.

One "mode" would be for a player to define a set space for the placed cards to rest on, which would be used to make the play feel more natural over VR software or AR.

Who would use this gadget?

This would be an easy-to-learn game that could be played in any context and takes only a few minutes, making it good for any game or environment with downtime, or any social experience. Programs like Hubs would be a perfect place to use it. It also requires no central play space outside of the user’s hand and the stack of cards in front of their person (players do need to see each other’s stacks).

Assuming that you're on the team, what other skillsets would you need to make this project happen over a couple days of hacking?

I actually could make the artwork for the cards, but I am not a programmer or experienced with 3D modelling.

What will be the toughest part of building this gadget?

The toughest part of making this gadget would likely be adding the UI that sources from common information, scripting rules of the game into the gadget, or making sure that players can draw from each other’s stacks of cards.
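
A hypothetical sketch of the shared state every player's gadget would need to agree on (all names invented here):

```typescript
// Hypothetical shared game state replicated between players' gadgets.
interface CardGameState {
  players: string[];               // stable player ids, in turn order
  stacks: Record<string, number>;  // face-down card count per player
  currentBet?: {                   // present while an auction is underway
    playerId: string;
    count: number;                 // cards the bidder claims they can flip safely
  };
  scores: Record<string, number>;  // points toward the two needed to win
}
```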
