Oculus Unity Plugin

Developers of Oculus Rift games made with the Oculus Unity plugin can now add Windows MR headset support with their existing code.

Facebook first added SteamVR support to the Oculus Unity Integration back in November. In December an update expanded that support, adding VR overlays.


Until now this cross-platform support has been limited to the HTC Vive. The latest update adds official support for Windows MR headsets too. The documentation includes details of how the Oculus Touch APIs relate to the Windows MR controllers.

What This Does And Doesn’t Mean

As with the HTC Vive support, this has no direct user-facing consequences. Oculus Store games can still only ship with the Oculus API enabled in Unity, and this change won't make Windows MR headsets work there. What it does do is lower the technical barrier for developers building for Rift first and adding Windows MR support later, or releasing on both from the start.

A developer building a Unity game for the Oculus Rift can now add Windows MR support without integrating the separate SteamVR plugin; instead of managing two separate integrations, they can stick with one. We should of course note the SteamVR plugin can also be used to support both headsets, but it can't be used for Oculus Store builds. Ultimately, this means developers building for SteamVR first and planning to eventually submit to the Oculus Store would have to adopt the Oculus Integration anyway.
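To illustrate why a single integration matters, here is a minimal hedged sketch: with the Oculus Integration, the same OVRInput calls read whichever controller the runtime presents, whether Oculus Touch, Vive wands, or Windows MR controllers, so no per-platform input branching is needed. (OVRInput is part of the Oculus Integration; the exact WMR button mapping follows the documentation mentioned above.)

```csharp
using UnityEngine;

// Sketch: one input path for all supported controllers via OVRInput.
public class CrossPlatformInput : MonoBehaviour
{
    void Update()
    {
        // "Button.One" is the A button on Touch; on Vive and WMR controllers
        // it maps per the Oculus cross-platform documentation.
        if (OVRInput.GetDown(OVRInput.Button.One))
            Debug.Log("Primary button pressed");

        // Analog trigger reads identically across the supported controllers.
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
        if (trigger > 0.5f)
            Debug.Log($"Trigger at {trigger:F2}");
    }
}
```

This script assumes the Oculus Integration package is imported and runs inside a Unity scene; it is not a standalone program.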

Windows MR is nearing 10% market share in the Steam Hardware Survey. Hopefully the combination of that shift and this update prompts more developers to add Windows MR support.


David Heaney



Source: https://uploadvr.com/unity-port-rift-games-windows-mr/

Deploying to Oculus Quest


An Oculus Quest is required.

MRTK's support for the Oculus Quest comes via two different sources, Unity's XR SDK pipeline and the Oculus Integration Unity package. The Oculus XRSDK Data Provider enables the use of both sources and must be used to deploy MRTK on the Oculus Quest.

The Unity XR SDK pipeline enables the use of Oculus Touch controllers and head tracking with the Oculus Quest. This pipeline is the standard for developing XR applications in Unity 2019.3 and beyond, so make sure you are using Unity 2019.3 or newer; this is required to deploy MRTK applications to the Oculus Quest.

The Oculus Integration Unity package allows for the use of hand tracking with the Oculus Quest. This data provider does NOT use Unity's XR SDK Pipeline or Legacy XR Pipeline.
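As a hedged illustration of what the Oculus Integration's hand-tracking support looks like in practice, the sketch below reads pinch gestures through the package's OVRHand component (the component and method names are from the Oculus Integration; attaching it to an appropriate hand object, e.g. an OVRHandPrefab instance, is assumed):

```csharp
using UnityEngine;

// Sketch: reading Quest hand tracking via the Oculus Integration's OVRHand.
public class PinchProbe : MonoBehaviour
{
    private OVRHand hand;

    void Start() => hand = GetComponent<OVRHand>();

    void Update()
    {
        // Hand tracking can drop out; only query while the hand is tracked.
        if (hand == null || !hand.IsTracked) return;

        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength {strength:F2}");
        }
    }
}
```

This runs inside Unity with the Oculus Integration imported; it is a sketch, not part of MRTK's own data provider code.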

Setting up project for the Oculus Quest

  1. Follow these steps to ensure that your project is ready to deploy on Oculus Quest.

  2. Ensure that developer mode is enabled on your device. Installing the Oculus ADB Drivers is optional.

Setting up the XR SDK Pipeline for Oculus Quest

  1. Ensure that the Oculus XR Plugin is installed under Window --> Package Manager

    Oculus XR Plugin Package

  2. Make sure that the Oculus Plug-in Provider is included in your project by going to Edit --> Project Settings --> XR Plug-in Management --> Plug-in Providers

    Oculus Plugin Provider

Setting up the Oculus Integration Unity package to enable hand tracking

  1. Download and import Oculus Integration from the Unity Asset Store. The latest version tested to work is 20.0.0. Older versions can be found from this archive.

  2. Navigate to Mixed Reality Toolkit > Utilities > Oculus > Integrate Oculus Integration Unity Modules. Doing this updates the asmdefs with the definitions and references needed for the relevant Oculus Quest code to function. It also updates the csc file to filter out the obsolete warnings produced by the Oculus Integration assets. The MRTK repo contains a csc file that converts warnings to errors, and this conversion would otherwise halt the MRTK-Quest configuration process.

    Oculus Integration Asmdef
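For illustration only, a hypothetical csc.rsp along the lines the step above describes; the exact contents MRTK writes may differ, and the warning numbers here are illustrative:

```
# Hypothetical csc.rsp contents (illustrative only)
# MRTK's default: treat compiler warnings as errors
-warnaserror+
# ...but suppress the 'obsolete API' warnings raised by Oculus Integration assets
-nowarn:0618
```

Unity picks up a csc.rsp placed next to the scripts it applies to; MRTK's menu item manages this file so you normally don't edit it by hand.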

  3. In the imported Oculus folder (It should be found at Assets/Oculus), there is a scriptable object called OculusProjectConfig. In that config file, you need to set HandTrackingSupport to "Controllers and Hands".

    Oculus Integration Controller And Hands

Setting up the scene

  1. Create a new Unity scene or open a pre-existing scene like HandInteractionExamples.
  2. Add MRTK to the scene by navigating to Mixed Reality Toolkit > Add to Scene and Configure.

Using the Oculus XR SDK Data Provider

  1. Configure your profile to use the Oculus XR SDK Data Provider
    • If not intending to modify the configuration profiles

    • Otherwise, follow these steps:

      • Select the MixedRealityToolkit game object in the hierarchy and select Copy and Customize to clone the default mixed reality profile.

      Clone Profile

      • Select the Input Configuration Profile.

      Input Configuration Profile

      • Select Clone in the input system profile to enable modification.

      Clone Input System Profile

      • Open the Input Data Providers section, select Add Data Provider at the top, and a new data provider will be added at the end of the list. Open the new data provider and set the Type to Microsoft.MixedReality.Toolkit.XRSDK.Oculus > OculusXRSDKDeviceManager.

      Oculus Add XRSDK Data Provider


  2. The Oculus XR SDK Data Provider includes an OVR Camera Rig prefab which automatically configures the project with an OVR Camera Rig and OVR Hands to properly route input. Manually adding an OVR Camera Rig to the scene will require manual configuration of settings and input.

Build and deploy your project to Oculus Quest

  1. Plug in your Oculus Quest via a USB 3.0 to USB-C cable

  2. Navigate to File > Build Settings

  3. Change the deployment to Android

  4. Ensure that the Oculus Quest is selected as the applicable run device

    Oculus Run Device

  5. Select Build and Run

    • You will likely encounter the following set of build errors when you select Build and Run the first time. You should be able to successfully deploy upon selecting Build and Run again.

    Oculus Expected Build Errors

  6. Accept the Allow USB Debugging prompt from inside the Quest

  7. See your scene inside the Oculus Quest
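If Unity doesn't detect the headset, the same connection can be checked from the command line. The commands below are an illustrative sketch using the Android Debug Bridge (`adb`) that ships with the Android SDK; the APK path is hypothetical:

```sh
# List connected devices; the Quest should appear as "device",
# not "unauthorized" (accept the debugging prompt in the headset if it does).
adb devices

# Install (or reinstall) a previously built APK directly, bypassing Unity.
adb install -r Builds/MyMrtkApp.apk

# Stream Unity's log output from the headset while the app runs.
adb logcat -s Unity
```

These require a Quest in developer mode connected over USB.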

Removing Oculus Integration from the Project

  1. Navigate to the Mixed Reality Toolkit > Oculus > Separate Oculus Integration Unity Modules
    Oculus Separation Asmdef
  2. Let Unity refresh as references in the Microsoft.MixedReality.Toolkit.Providers.Oculus.asmdef and other files are modified in this step
  3. Close Unity
  4. Close Visual Studio, if it's open
  5. Open File Explorer and navigate to the root of the MRTK Unity project
  6. Delete the UnityProjectName/Library directory
  7. Delete the UnityProjectName/Assets/Oculus directory
  8. Delete the UnityProjectName/Assets/Oculus.meta file
  9. Reopen Unity

Common errors

Quest not recognized by Unity

Make sure your Android paths are properly configured. If you continue to encounter problems, follow this guide.

Edit > Preferences > External Tools > Android

Android Tools Config

Source: https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/supported-devices/oculus-quest-mrtk

6 Steps to Set Up Your Oculus Rift with Unity in Less Than 10 Minutes [2021 Updated]

Even though the Oculus Rift has been overshadowed by its younger and more nimble sibling, the Oculus Quest 2, it is still a popular PC VR headset.

With Oculus Link, millions of Quest 2 users can now connect their headsets to their PCs and enjoy a huge library of PC VR games. With the recent addition of the 'Air Link' feature, even the USB cable isn't necessary.

This means if you own an Oculus Rift, you have a perfect opportunity to build and test high-demand VR applications for an ever-growing base of Oculus users.

So it's imperative to know how to set up your Oculus Rift for developer mode.

Once in developer mode, you'll be able to start development in Unity with Oculus assets and test your apps on the go!

A couple of technicalities in case you just ordered the Rift S and want to use it with a gaming laptop: Oculus put a USB connector on one side of the cable (all good so far) but a DisplayPort connector on the other end, which is outdated and incompatible with most laptops' I/O.

To avoid the issue, order a Displayport to Type-C adapter ahead of time. We recommend getting a Type-C instead of HDMI port adapter since most of the users report issues with HDMI adapters.

The setup process for the Rift and Rift S is identical; the Rift S is just an upgraded version with a sharper LCD display.

How to Prepare Your Oculus Rift For Development

At this point we assume that you've already used an Oculus Rift before, are familiar with core features such as the guardian system (where you trace the edge of your play area to stay within the guardian boundaries while in VR), and have gone through the full setup already.

To create apps for the Oculus, you need a laptop/desktop computer, a USB headset cable, a copy of Unity (previously known as Unity3D), and the Oculus Integration package.

  • Unity is an application that lets you build, test, and publish your original content on Oculus Home! In other words, it's a content creation engine for building VR applications.
  • Oculus Integration Package is a Unity plugin that allows you to build apps for the Oculus with the correct framework.

To start things off, Usman shares our Oculus Rift setup video with Unity. Next, we'll go over a recap of the video and some common issues.

Once you've downloaded the Oculus Rift software, make sure you have the latest Oculus updates installed. Depending on when you set up, the Oculus software package is 5 GB or more.

If you're having issues with the initial setup, we recommend contacting customer support at Facebook Technologies. Some problems are trivial: for example, make sure you have sufficient disk space and that your computer satisfies the minimum requirements, such as graphics card and RAM, for the best VR experience. (Check out our guide on VR hardware, or check your desktop's capabilities with the SteamVR performance checker.)

Let’s take a deep dive into virtual reality development!

Download our 10-week XR Development with Unity Course Syllabus


Step 1: Connect Your Oculus Rift with USB

Before we start, open the Oculus Companion App on your phone and go to Settings > More Settings > Developer Mode and toggle the Developer Mode on.

Now plug your Oculus Rift into your PC using a USB cable; make sure you're using a USB-C cable or an Oculus Link cable. In the prompt that follows, allow USB debugging and check that Always allow from this computer is enabled.

Now we’re able to launch and test VR applications using your virtual reality headset! Let’s proceed to setting up your Unity environment.

Do I Need a Facebook Account to Develop Apps With my new Oculus Rift?

According to the Facebook guidelines for Oculus users, everyone who uses previous generation Oculus devices for the first time will need to log in with a Facebook account. The guidelines were introduced in October 2020.

So you need a Facebook account if you want to access Oculus Store and purchase content. But it’s a bit more complex than that.

If you want to use Oculus Rift exclusively for development purposes without accessing content from the Oculus Store, you have a choice: either connect your development account to your personal Facebook or set up a Test User account to test your VR applications.

In 2023, Facebook will cease supporting legacy Oculus Accounts and might require Facebook accounts from new developers that are not using Oculus for Business Platform.

Step 2: Prepare Unity for VR Development

We're going to start by downloading the newest version of Unity engine. Install Unity on your computer by running the installer package and following simple instructions. Once you have Unity installed, create a new 3D project by clicking File > New Project. Name your project, select 3D from the Template dropdown menu and click Create.

Once you’ve created the new project, click on File then Build Settings. Let’s go through a couple of settings here:

  1. Set Platform to PC, Mac & Linux Standalone.
  2. Under Target Platform, select Windows (or other OS if it's different).
  3. You can toggle Development Build later to test and debug the app.
    Note: Clear this selection when you're ready to release the final build.

Finally, let’s enable VR support before we start building our first VR application:

  1. Go to Edit > Project Settings.
  2. Click XR Plugin Management, then click Install XR Plugin Management.
  3. Open the Windows tab and select Oculus to install the Oculus XR plugin and enable VR support in Unity.

    Note: If you’re using older Unity versions (before 2019.3), follow these instructions instead to enable VR support in Unity:
    1) From the Build Settings menu, click Player Settings > XR Settings and check Virtual Reality Supported.
    2) Make sure “Oculus” appears in the list. If it doesn’t, click the plus symbol below and select “Oculus” from the dropdown menu that appears.

We’re done with preparations. Let’s dive into creating our testing VR environment in Unity!

Step 3: Connect the Oculus Integration Plugin to Unity

Follow the screen instructions: click on the Asset Store tab (Window > Asset Store) and search for “Oculus Integration” at the top.


First, you need to download the package. After that proceed by clicking Import in the Asset store. To finish the import, click the Import button as shown below:

After that the Oculus Integration package will be imported into our current project.

Now we can use Oculus package components and assets to quickly set up our Virtual Reality environment.

Let’s start with setting up Oculus Camera that represents the position of VR headset and your players’ eyes.

Step 4: Set Up The Camera (OVRCameraRig and Main Camera)

In your Assets tab, navigate to Oculus > VR > Prefabs.

From there, click and drag the OVRCameraRig prefab into your scene.

Your project is now acting funny because there are two different cameras in one scene: the default Main Camera and the OVRCameraRig we've just added. Delete your original Main Camera in order to avoid conflict with the newly imported OVRCameraRig.

Browse your Hierarchy tab and find the Main Camera object, then right click on it and select Delete.

Now that we got the Main Camera out of the way, let’s set up our OVRCamera Rig.

Click on the OVRCameraRig in your Hierarchy tab. Then, take a look in the Inspector tab on the far right of your screen for Tracking Origin Type in the OVR Manager (Script) component. Click the Tracking Origin Type property so a drop down menu appears, and select Floor Level.

By doing this we are fixing the user's position and orientation relative to the floor. The position in VR is calculated based on the user’s height which they indicate when setting up their Rift device.

In other words, regardless of who uses your app, it will adjust to fit everyone!
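If you prefer to set the tracking origin from code instead of the Inspector, here's a minimal sketch; it assumes the Oculus Integration is imported and an OVRManager (it lives on OVRCameraRig) is present in the scene:

```csharp
using UnityEngine;

// Sketch: select the floor-level tracking origin at startup rather than
// via the Inspector's Tracking Origin Type dropdown.
public class TrackingOriginSetup : MonoBehaviour
{
    void Start()
    {
        if (OVRManager.instance != null)
            OVRManager.instance.trackingOriginType =
                OVRManager.TrackingOrigin.FloorLevel;
    }
}
```

Doing it in a script is handy when you switch origins per scene; otherwise the Inspector setting described above is all you need.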

Step 5: Enable Oculus Touch Controllers

Now that we've set up our camera, we need to set up the controllers to represent the player's hands.
If you want the player to use their Oculus Avatar hands, navigate to Oculus > Avatar > Content > Prefabs under the Assets panel.

Drag the LocalAvatar prefab into your scene:

Done! Now that we have the controllers in the scene, your player will be able to interact with objects within the application you build. But we still need to make the controllers visible as avatar hands.

In your Hierarchy Tab, click the LocalAvatar game object. Then, check out the Inspector tab. Under the OVR Avatar (Script) component in the Inspector tab, find the “Start With Controllers” property and make sure it’s selected.

Save the scene. You now have a project that’s ready to develop for the Oculus Rift!


Step 6: Click Play!

Turn on your Oculus Rift controllers and click Play in Unity.

You should be able to see your scene as well as your two touch controllers. If your scene doesn’t look like much, that’s because it isn’t at this point. Now that you’re all set up, it’s time to get building and turn that space into something incredible!

You should also see your controllers moving around in the Scene tab.

Before you move ahead with your project, make sure you’re familiar with the capabilities of your hardware. Start by checking out all of the button inputs on your controllers. You’ll find it has a range of input options that include:

  • Button One, Button Two, Button Three, Button Four, Button Start, Button PrimaryThumbstick and Button SecondaryThumbstick
  • PrimaryIndexTrigger, SecondaryIndexTrigger, PrimaryHandTrigger, SecondaryHandTrigger, PrimaryThumbstick and SecondaryThumbstick
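The inputs listed above can all be polled through OVRInput. A hedged sketch (assumes the Oculus Integration is imported and an OVRManager is active in the scene):

```csharp
using UnityEngine;

// Sketch: polling the buttons, triggers, and thumbsticks listed above.
public class ControllerProbe : MonoBehaviour
{
    void Update()
    {
        bool a = OVRInput.Get(OVRInput.Button.One);               // A button
        bool stickClick = OVRInput.Get(OVRInput.Button.PrimaryThumbstick);
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger);
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

        if (a || stickClick || grip > 0.5f || stick.sqrMagnitude > 0.01f)
            Debug.Log($"A:{a} click:{stickClick} grip:{grip:F2} stick:{stick}");
    }
}
```

Attach it to any object in the scene and watch the Console while pressing controller inputs.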

As you continue to develop your scene, make sure you take advantage of the huge amount of resources at the Oculus Developer Center.

Remember, if you run into any trouble or need one-on-one guidance from Unity pros, you can always find it in our live 10-week course or 1-on-1 packages with expert AR/VR mentor.

And that's it!

Hide your cords out of the way and tighten your velcro straps. Unfortunately, there is no wireless option for the Oculus Rift HMD or Rift S, and Facebook hasn't updated the software for Rift devices in a while. Sorry.

Common Error: I Can't See My Controller!

If you can't see your Touch Controllers in the scene, and Unity is giving you an error in the Project tab, try these fixes:

        1. Reinstall the latest version of Unity.
        2. Reinstall the latest version of the Oculus app. Sometimes, the Oculus app doesn't update past a certain version and reinstalling can be a quick fix. Download Oculus here.
        3. If prompted, update your firmware (headset, controllers and sensors).
        4. Make sure your Oculus Touch controllers are paired with your headset. From the Oculus app, click Devices > Right Touch / Left Touch > Forget Device on your old controllers, then re-pair your controllers under Devices > Configure Rift > Pair Touch. If both controllers say On, you should be good to go!

    1. What's next? If you're just starting out, join one of our live or on-demand VR Workshops. Or if you prefer to get right into it, read our guide on grabbing objects in Unity. Attention: use the latest drivers from Oculus and Unity to support your AR/VR development needs.
    2. Alternatively, look at our complete 10-week XR Development with Unity course syllabus.


      Technical Steps:

        1. Go to Build Settings > Player Settings > XR Settings and check Virtual Reality Supported. Make sure “Oculus” appears in the list; otherwise, click the plus symbol below and select it from the dropdown menu that appears.
        2. Go to the Unity Asset Store tab (click Window > Asset Store) in Unity and search Oculus Integration in the search field at the top.
          Download and import the Oculus Integration into your project.
        3. At the end of the import you may see a prompt that says API update required. Click on “I made a backup, go ahead!”
          Navigate to OVR > Prefabs in your assets tab.
        4. Click and drag the OVRCameraRig prefab into your scene.
        5. Delete the Main Camera from the scene in the Hierarchy tab, as it will interfere with the OVRCameraRig. Select OVRCameraRig from your Hierarchy tab, then look in the Inspector tab for “Tracking Origin Type” in its “OVR Manager (Script)” component. Click that property so a dropdown menu appears, and select “Floor Level”.
        6. To enable the Oculus Avatar with hands and to use the controllers navigate to OvrAvatar > Content > Prefabs in your assets tab.
        7. Click and drag the LocalAvatar prefab into your scene.
        8. Select LocalAvatar from your Hierarchy tab, then look in the Inspector tab. Find “Start With Controllers” in its “OVR Avatar (Script)” component and make sure the box next to it is checked.
        9. Save the scene! You now have a project ready to develop for the Oculus Rift!
Source: https://circuitstream.com/blog/oculus-unity-setup/

The latest version of the Oculus Integration for Unity, v23, adds experimental OpenXR support for Quest and Quest 2 application development. A new technique for reducing positional latency called ‘Phase Sync’ has been added to both the Unity and Unreal Engine 4 integrations; Oculus recommends that all Quest developers consider using it.

OpenXR Support for Oculus Unity Integration

OpenXR, the industry-backed standard that aims to streamline the development of XR applications, has made several major steps this year toward becoming production ready. Today Oculus released new development tools which add experimental OpenXR support for Quest and Quest 2 applications built with Unity.

OpenXR aims to allow developers to build a single application which is compatible with any OpenXR headset, rather than needing to build a different version of the application for each headset runtime.

While Unity is working on its own OpenXR support, the newly released v23 Oculus Integration for Unity adds support for an “OpenXR experimental plugin for Oculus Quest and Oculus Quest 2.” This should allow for the development of OpenXR applications based on the features provided by the Oculus Integration for Unity.

Earlier this year Oculus released OpenXR support for building native Quest and Rift applications as well.

Phase Sync Latency Reduction in Unity and Unreal Engine

The v23 Oculus Integration for Unity and for Unreal Engine 4 also bring new latency reduction tech called Phase Sync which can reduce positional tracking latency with ‘no performance overhead’, according to Oculus. The company recommends “every in-development app to enable [Phase Sync], especially if your app is latency sensitive (if it uses hand tracking, for example).”

While Quest has long used the Asynchronous Timewarp to reduce head-rotation latency by warping the rendered frame to the most recent rotational data just before it goes to the display, positional tracking doesn’t benefit from this technique.

One way to reduce positional tracking latency is to minimize the amount of time between when a frame starts rendering and when it actually reaches the display. Ideally the frame will finish rendering just before being sent to the display; if it finishes early, all of the time between when the frame is finished and when it is sent to the display becomes added positional latency.

Phase Sync introduces dynamic frame timing which adjusts on the fly to make sure frames are being completed in an optimal way for latency reduction.

Unlike the Oculus PC SDK, the Oculus Mobile SDK has been using fixed-latency mode to manage frame timing since its inception. The philosophy behind fixed-latency mode is to finish everything as early as possible to avoid stale frames. It achieves this goal well, but with our release of Quest 2, which has significantly more CPU and GPU compute than our original Quest, a lot of apps can finish rendering their frames earlier than planned. As a result, we tend to see more “early frames” […]

Compared with fixed-latency mode, Phase Sync handles frame timing adaptively according to the app’s workload. The aim is to have the frame finish rendering right before our compositor needs the completed frame, so it can save as much latency as possible, and also not missing any frames. The difference between Phase Sync and fixed-latency mode can be illustrated in the following graph on a typical multi-threaded VR app.

Luckily, turning on Phase Sync is as easy as checking a box with the v23 Unity and Unreal Engine integrations from Oculus (details here).

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.

Ben Lang

Source: https://www.roadtovr.com/oculus-unity-openxr-quest-phase-sync/


About the Oculus XR Plugin

The Oculus XR Plugin enables you to build applications for Oculus devices.

The XR Management package will now serve as the main entry point for downloading the right package for each target SDK/platform and managing respective settings.

To install the Oculus XR Plugin, do the following:

  1. Install the XR Management package from Package Manager
  2. Once installed, the XR Management package will take you to the project settings window.
  3. Click on the XR tab
    • Note: The XR tab will not exist in Project Settings unless the XR Management package has been installed.
  4. In the XR tab window, click on the plus button to add a Plugin Loader
  5. Go to Download -> Oculus Loader

The Oculus XR Plugin will automatically download via the Package Manager.

Note that the XR settings tab now has a dropdown for "Oculus". Navigate to the XR -> Oculus settings window in Project Settings to create an Oculus XR Plugin specific settings asset. This asset is editable from the Oculus settings window and can toggle settings such as Dash Support, Shared Depth buffer support, and the Rendering Mode.

Requirements

This version of Oculus XR Plugin is compatible with the following versions of the Unity Editor:

Release Notes

Known Issues

  • Oculus Integration utilities are not currently fully supported when using the Oculus XR Plugin
  • [Quest][Rift S] controllers report a "Thumbrest" usage, which they do not have.
  • Oculus XR Plugin is failing some publishing validation tests
    • VRC validator test
    • TestSubmitFramesWhenNotVisible
    • TestResponseToRecenterRequest (fails with Integrated Packages as well)
    • TestAppShouldQuit
  • Camera is in different position when entering playmode compared to Legacy XR when using the TPD
    • It is possible to work around this by:
    • Create empty game object
    • Set it at 0, 1, -10 (the default camera location)
    • Make the camera a child of this game object, set camera transform to 0, 0, 0.
    • Enter playmode and camera is now in the expected location
  • Changing the render viewport scale and opening the oculus dash while depth and dash support are enabled causes unexpected rendering artifacts
  • Render viewport scale is broken when HDR is off (this affects Integrated XR as well)
  • XRDevice API values are not populated (Documentation)
  • Rift S is not fully supported
  • View is not landscape locked when deploying to GearVR in developer mode
  • Known issues with Single Pass Instancing on PC:
    • Shadows may have issues when using the LWRP (Lightweight Render Pipeline)
    • Depth Sharing doesn't currently work with Single Pass Instancing
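The playmode camera-position workaround listed above can also be scripted. A hedged sketch (the parent object's name is arbitrary; it assumes the scene camera is tagged MainCamera):

```csharp
using UnityEngine;

// Sketch of the workaround under Known Issues: wrap the camera in an
// offset parent so its XR SDK playmode position matches Legacy XR.
public class CameraOffsetRig : MonoBehaviour
{
    void Awake()
    {
        var cam = Camera.main;
        if (cam == null) return;

        // The default Unity camera location the workaround restores.
        var offset = new GameObject("CameraOffset");
        offset.transform.position = new Vector3(0f, 1f, -10f);

        // Parent the camera and zero its local transform, as the steps describe.
        cam.transform.SetParent(offset.transform, worldPositionStays: false);
        cam.transform.localPosition = Vector3.zero;
        cam.transform.localRotation = Quaternion.identity;
    }
}
```

This mirrors the manual steps; doing it in Awake keeps the rig set up before tracking starts.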

Document revision history

Date | Reason
June 10, 2019 | First official preview version of package.
Source: https://docs.unity3d.com/Packages/[email protected]/manual/index.html