benArcen

Unity Questions Thread


Getting angles to work with you in Unity is a bit tricky in 2D, so keep a few notes in mind.  First off, a 2D vector can be represented as an angle, and in Unity's coordinate system a "zero" angle is a line pointing directly to the right (such as the Vector3.right property).  I've had to do this quite a bit in my game and I've had success with the following methods.  Basically you can convert any direction into an angle by passing the object's transform.right property as the argument to the Vector2ToAngle method, and from there you can do what you like.  Note that all but the last method return values normalized to be positive.  Generally speaking, with 2D in Unity you should use the transform.right property as though it were the object's forward direction, so that all objects at zero rotation are oriented to face to the right.

 

public static float Vector2ToAngle(Vector2 aDirection)
{
    // Converts a direction into degrees, measured counter-clockwise from Vector3.right.
    float degrees = Mathf.Atan2(aDirection.y, aDirection.x) * Mathf.Rad2Deg;

    // Atan2 returns -180 to 180, so shift negative results into the 0-360 range.
    if (degrees < 0)
    {
        degrees += 360;
    }

    return degrees;
}

public static Quaternion GetZRotation(Vector3 aDirection)
{
    // Builds a rotation around the Z axis that faces the given direction.
    return Quaternion.Euler(0, 0, Vector2ToAngle(aDirection));
}

public static float AngleDifferenceBetweenXYVectors(Vector3 aXYforwardDirection, Vector3 aXYangleToVector)
{
    // Vector2.Angle already returns an unsigned angle between 0 and 180 degrees.
    return Vector2.Angle(aXYforwardDirection, aXYangleToVector);
}

public static float SignedAngleBetweenXYVectors(Vector3 aXYForwardDirection, Vector3 aXYAngleVector)
{
    // Vector2.Angle is unsigned, so use the 2D cross product to recover the sign
    // (positive when the target direction is counter-clockwise from the forward direction).
    float angle = Vector2.Angle(aXYForwardDirection, aXYAngleVector);
    float cross = aXYForwardDirection.x * aXYAngleVector.y - aXYForwardDirection.y * aXYAngleVector.x;
    return cross < 0 ? -angle : angle;
}
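
As a quick usage sketch (assuming these helpers sit in a static class, which I'll call MathHelpers here, and an orthographic camera looking down the Z axis), rotating an object to face the mouse could look like this:

using UnityEngine;

public class AimAtMouse : MonoBehaviour
{
    void Update()
    {
        // Convert the mouse position into world space on the XY plane.
        Vector3 mouseWorld = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        mouseWorld.z = transform.position.z;

        // Direction from this object to the mouse, turned into a Z rotation.
        Vector3 direction = mouseWorld - transform.position;
        transform.rotation = MathHelpers.GetZRotation(direction);

        // transform.right now points at the mouse, following the
        // "treat transform.right as forward in 2D" convention above.
    }
}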


Thanks so much for all the help! Hadn't responded yet because I've been too busy to try implementing it but the explanation makes sense so I should be able to manage it.

Maybe some day I can know a thing and help people... a coder can dream...


I got my whole LookAt/2D collider thing sorted, thanks for all the help Thumbs! You all rock.

 

I have a question that is actually more of a question than a problem for once. I find that sometimes GetButtonUp can be unreliable, in that a button can no longer be pressed but still not trigger an if statement that uses GetButtonUp.

Has anyone else found this? Is it intended behaviour where more specific conditions are needed to trigger GetButtonUp, is it just an unreliable function, or could it be bad hardware that doesn't always send the right input signals?

 

I can easily fix my code to work without it, I just wanted to know if anyone else here had experience of this and could illuminate it for me.


I've never had this problem myself, but I've heard about it from a few places.  I know that when working with different types of buttons Unity is quite picky about the Input Manager's settings as well as each axis's settings, so first I would give those a once-over to make sure the values are being read correctly.


What condition are you trying to get out of it, SBM? Just from your wording it sounds like you want to know when the button is not being pressed, but GetButtonUp is only true on the frame you stopped pressing the button, not the whole time the button isn't pressed. Same with GetButtonDown: it's only true on the frame you started pressing the button.

 

The most reliable way to check for this is to check if GetButton is true or false. If it's false, then the button isn't being pressed.
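
In other words (a tiny sketch, using the default "Jump" button as an example):

void Update()
{
    if (Input.GetButtonDown("Jump")) Debug.Log("pressed this frame");   // true for exactly one frame
    if (Input.GetButton("Jump"))     Debug.Log("held right now");       // true every frame it's held
    if (Input.GetButtonUp("Jump"))   Debug.Log("released this frame");  // true for exactly one frame
}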


What I'm trying to do is have a projectile charge up while the space bar is held down and then fire when it's released. I was using GetButton and GetButtonUp as the triggers for these. So I was trying to have it trigger on one frame, but it sometimes didn't actually fire the first time. But I know I can just use a condition where the charging variable is greater than one and GetButton is false, so this instance is solved. I'll check over the input settings and see if I have anything funny in there.
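
For reference, here is a minimal sketch of that charge-and-release pattern checked in Update; the "Fire" button name, the chargeTime variable and the FireProjectile method are just placeholders, not anything from the posts above.

float chargeTime;

void Update()
{
    if (Input.GetButton("Fire"))
    {
        // Button held: keep charging.
        chargeTime += Time.deltaTime;
    }
    else if (chargeTime > 0f)
    {
        // Button released after charging started: fire once, then reset.
        FireProjectile(chargeTime);   // FireProjectile stands in for whatever actually spawns the shot
        chargeTime = 0f;
    }
}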


Hmm, yeah, ok that's weird.. the only thing I could think of is if you were checking for input in FixedUpdate instead of Update?


... Actually that might be it. I should pay more attention to the difference between Update and FixedUpdate. I've mostly kept everything in FixedUpdate but I know that's not right.


Yeah, that might bugger it up, since FixedUpdate doesn't necessarily run every rendered frame, so it can miss the one frame where you let go of or press the button!


I try to put all my input-checks in Update(). That's one of my general rules. It's been working well for me.

So for something that uses the physics system (when I'm not feeling lazy), I put an input check in Update(), put the physics stuff in a method I write, and then call that method in FixedUpdate(), conditional on a bool set from Update().

I just made it sound more complicated than it feels. 

This is usually what my code ends up looking like. There may be a better way.

 

bool doingPhysics;

void PhysicsFireOff()
{
  if(doingPhysics==true)
  {
    // e.g. push upward; use whatever force and direction you actually need here
    this.rigidbody.AddForce(Vector3.up * 5f);
  }
}

void Update()
{
  if(Input.GetAxis("axisName")!=0f)
  {
    doingPhysics=true;
  }
  else
  {
    doingPhysics=false;
  }
}

void FixedUpdate()
{
  PhysicsFireOff();
}

    



Yes, it is a good idea to treat Update() as the default place to update the state of your game unless you have a good reason to use FixedUpdate().

 

Physics is a good example, and it is probably the reason FixedUpdate() exists in the first place. Physics systems try to simulate non-trivial continuous real-life phenomena in discrete steps, and there are good reasons to keep those time steps fixed. If we used variable time steps with fps compensation (i.e. multiplying the values we add to something by Time.deltaTime), we would tie the behavior of our game to the framerate. We could miss some physics interactions at lower framerates, and things that are not linear (i.e. things whose effects aren't simply twice as large when the cause variables double, e.g. applying acceleration to a body) would behave differently at different framerates unless we took great care to compensate for that. Also, because computer-represented floating point values have variable precision, physics systems are often tuned for a range of distance, mass and time values the user can rely on without risking a loss of stability, and a fixed time step helps ensure those conditions hold (see e.g. falling through an elevator floor when running Dark Souls at 60 fps).

 

On the other hand, FixedUpdate() is not really tied to the frames being drawn, so apart from possibly missing some important events, anything we update only in FixedUpdate() can look janky. In fact Unity physics can look pretty janky unless the bodies have interpolation or extrapolation enabled (or have a look at the first Bioshock as a non-Unity example: the physics run at a lower framerate even when the game itself runs smoothly, which looks pretty weird). It's generally better to use Update() for non-physics movement (as in movement performed in a script, not through Physics or Physics2D) and compensate for variable time steps using Time.deltaTime.
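
As a small illustration of that last point (just a sketch; moveSpeed is an arbitrary field and "Horizontal" is one of Unity's default axes), framerate-independent movement in Update() usually looks something like:

public float moveSpeed = 5f;

void Update()
{
    float horizontal = Input.GetAxis("Horizontal");
    // Scaling by Time.deltaTime keeps the speed in units-per-second,
    // regardless of how long the last frame took to render.
    transform.Translate(Vector3.right * horizontal * moveSpeed * Time.deltaTime);
}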

 

Hopefully that shed some light on why both Update() and FixedUpdate() exist and what their pros and cons are.


iax beat me to it, but yeah, FixedUpdate pretty much exists for physics-fudgin'.

 

Also physics engines are weird and despite being designed to be predictable... sometimes unpredictable. It's my favorite part of games programming! :3


Does anyone know of a way to simulate touch controls in-editor on a PC? I was hoping I could just create dummy Touch objects, but every field/property in Touch is read-only, as is Input.touches. I have Unity Remote working so I can test using the iPad connected to my MacBook; I just really prefer developing on my desktop, so it'd be handy.
 

edit: Chatted with SpennyDubz about this in IRC for a second, and his suggestion (or my understanding of it, anyway) was to have an input wrapper class handle all input and reference that rather than directly reading Input.touches. Which is a good suggestion and appears to be the correct way to do this; I was just hoping for... well, an easier way.
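
The wrapper doesn't have to be elaborate, either. A rough sketch of the idea (the names are made up, it only handles a single touch/click, and it uses the mouse as the stand-in while running in the editor):

using UnityEngine;

public static class PointerInput
{
    public static bool IsDown()
    {
#if UNITY_EDITOR
        // In the editor, treat the left mouse button as a touch.
        return Input.GetMouseButton(0);
#else
        return Input.touchCount > 0;
#endif
    }

    public static Vector2 Position()
    {
#if UNITY_EDITOR
        return (Vector2)Input.mousePosition;
#else
        return Input.touchCount > 0 ? Input.GetTouch(0).position : Vector2.zero;
#endif
    }
}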


Can confirm I used FixedUpdate; am rectifying this now. Thanks again Thumbs! You're all going into the credits sequence when I finally make a full game.



 

If you're looking for a good input-helper type of class for Unity, I'd recommend cInput 2.  Basically it gives you an input manager that is set up to read all possible axes, and then you just pick and choose the ones you want with its wrapper class.  It'll allow you to have rebindable inputs at runtime (the stock Input Manager can't) and you still have access to all of Unity's Input stuff if you want it.  As far as touches go, it may appear daunting but really isn't too bad.  You'll want to put something together that just looks at the touches Unity gives you and figures out what is happening from there.  I remember a while back some MIT students made some kind of an XNA-like input wrapper for Unity; I'll put a link into this post if I can find it.  Also, XNA's touchscreen API is pretty easy to understand and wouldn't be that hard to replicate if you need something to start from.

 

http://msdn.microsoft.com/en-us/library/microsoft.xna.framework.input.touch.aspx



 

Ooh, cInput looks neat. I've actually separately been looking for a way to dynamically bind controls in order to handle local multiplayer games without having to set each input up individually for each player, and that seems like it'll work.

 

I have working touch controls using the stock Input.Touch stuff; the problem is that I can't easily test them without copying builds onto my MacBook and running Unity Remote, so I was hoping for a way to emulate touches using the mouse. The axis stuff in cInput seems like it'll solve my testing problems. Thanks for the recommendation!


I'm using InControl, just to throw that into the suggestion bin for input managers. I'm not sure how it compares to cInput, but it was easy enough to set up and quickly got me using a 360 controller with standardized controls across Mac/PC (for some reason all of the buttons map completely differently in regular Unity Input).

 

It also just sounds nice, like I'm... in control.


I'm getting an error when I ask for an array's Length, but only in the method that sets the values.

 

These two snippets are from the same script:

This one returns an error.

public void PopulateBlocks(Transform _block, int column, StepProperties block_stepProperties)
	{
		//Debug.Log (_block);
		//Debug.Log (column);
		//Debug.Log (block_stepProperties);
//Here starts the problem
		Debug.Log ("Notes.populate.stepBlocks[]"+stepBlocks.Length);
		//Debug.Log ("Notes.populate.step_stepProperties[]"+step_stepProperties.Length);
		//stepBlocks[column]=_block;
		//step_stepProperties[column]=block_stepProperties;
	}

This one does not

void Start()
{
   steps=new int[stepQuantity];
   stepBlocks=new Transform[stepQuantity];
   step_stepProperties=new StepProperties[stepQuantity];
   Debug.Log ("notes.stepBlocks.length"+stepBlocks.Length);
}





What am I missing?


Could you include the other bits of code that interact with those methods?  Or perhaps the error you are seeing (if any) in the console, or even just the intended result vs. what is actually happening.  I can't quite tell without more info, but if the error is related to the array's Length, my best guess is that the PopulateBlocks() method is firing before the Start() method where the arrays are initialized.
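
If that turns out to be the case, one common fix (just a sketch, and assuming those arrays only depend on this script's own fields) is to allocate them in Awake() instead, since Awake() runs as soon as the object is created or instantiated, before other scripts get a chance to call into it:

void Awake()
{
    // Allocate this script's own arrays here so they already exist when another
    // script calls PopulateBlocks() in the same frame this object was instantiated.
    steps = new int[stepQuantity];
    stepBlocks = new Transform[stepQuantity];
    step_stepProperties = new StepProperties[stepQuantity];
}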



 

I can, but I have a custom script-execution order. This script is supposed to go before the script that accesses it. 

I'll try to include more information though.

 

[attached screenshot: WSgX1cp.png]

 

This is the error:

NullReferenceException: Object reference not set to an instance of an object
Notes.PopulateBlocks (UnityEngine.Transform _block, Int32 column, .StepProperties block_stepProperties) (at Assets/Scripts/Sequencer/Notes.cs:41)
GridMaker.MakeGrid () (at Assets/Scripts/Sequencer/GridMaker.cs:173)
GridMaker.Start () (at Assets/Scripts/Sequencer/GridMaker.cs:71)

Here is all of Notes.cs and what I imagine is the important part of GridMaker.cs:

using UnityEngine;
using System.Collections;

public class Notes : MonoBehaviour {

	Transform _transform;

//Audio
	AudioSource _audioSource;
	float pitch;
	float pitch_initial;
	float fadeInRate;
	float fadeOutRate;
	bool fadingIn;
	bool fadingOut;
	float volume_initial;
	
//Steps
	int stepQuantity=500;
	int[] steps;
	//List of blocks of this note.
	Transform[] stepBlocks;
	StepProperties[] step_stepProperties;


	
	

	public void SetPitch(int _notesInScale, float row)
	{
		pitch=(row+1)*(pitch_initial/_notesInScale);
	}

	//Puts new blocks into this Note's list of blocks.
	public void PopulateBlocks(Transform _block, int column, StepProperties block_stepProperties)
	{
		//Debug.Log (_block);
		//Debug.Log (column);
		//Debug.Log (block_stepProperties);
//Here starts the problem
		Debug.Log ("Notes.populate.stepBlocks[]"+stepBlocks.Length);
		//Debug.Log ("Notes.populate.step_stepProperties[]"+step_stepProperties.Length);
		//stepBlocks[column]=_block;
		//step_stepProperties[column]=block_stepProperties;
	}

	public void Debugging(Transform _noob)
	{
		//Debug.Log (_noob);
	}	

//	public void SetQuantity(int _stepQuantity)
//	{
//		stepQuantity=_stepQuantity;
//	}
	
	//Delegates
	void OnEnable()
	{
		Metronome.TempoBeat+=CheckBeat;
	}
	void OnDisable()
	{
		Metronome.TempoBeat-=CheckBeat;
	}

	void CheckBeat(int _beat)
	{
		bool isActivated=step_stepProperties[_beat].GetActivated();
		if(isActivated==false)
		{
			return;
		}
		
		if(isActivated==true)
		{
			fadeInRate=step_stepProperties[_beat].GetFadeInRate();
			fadeOutRate=step_stepProperties[_beat].GetFadeOutRate();
			MakeNoise (_beat);
			//stepIsBeat=true;
		}
		else
		{
			fadingOut=true;
		}
		
	}

	void MakeNoise(int _beat)
	{
		//Move the speaker to the position of the current block in the sequence.
		_transform.position=stepBlocks[_beat].position;
		fadingOut=false;
		fadingIn=true;
	}
	void VolumeFadeIn()
	{
		if(_audioSource.volume+fadeInRate>volume_initial)
		{
			//I think this sets the volume back to zero
			_audioSource.volume=volume_initial;	

			fadingIn=false;
			return;
		}
		_audioSource.volume=_audioSource.volume+fadeInRate;
		
	}
	void VolumeFadeOut()
	{
		_audioSource.volume=_audioSource.volume-fadeOutRate;
		
		if(_audioSource.volume<fadeOutRate)
		{
			fadingOut=false;
		}	
	}

	// Use this for initialization
	void Start () 
	{
		_transform=this.gameObject.GetComponent<Transform>();

		_audioSource=this.GetComponent<AudioSource>();
		pitch_initial=_audioSource.pitch;
		volume_initial=_audioSource.volume;

//Copied from Step Properties
		_audioSource=this.GetComponent<AudioSource>();
		_audioSource.volume=0f;
		//I want the fade rates to actual be fractions of the real volume.
		fadeInRate=fadeInRate*volume_initial;
		//Otherwise there is no sound
		if(fadeInRate>volume_initial)
		{
			fadeInRate=volume_initial;
		}
		fadeOutRate=fadeOutRate*volume_initial;
//NOt sure bout this stuff
		steps=new int[stepQuantity];
		stepBlocks=new Transform[stepQuantity];
		step_stepProperties=new StepProperties[stepQuantity];
		Debug.Log ("notes.stepBlocks.length"+stepBlocks.Length);
		
	}
	
	// Update is called once per frame
	void Update () 
	{
//Copied from Step Properties
		if(fadingIn==true)
		{
			VolumeFadeIn();
		}
		if(fadingOut==true)
		{
			VolumeFadeOut();
		}
	
	}
}
 
 //Making the grid.
		for(int i=0; i<width; i=i+1)
		{
			
			for (int j=0; j<height; j=j+1)
			{
				//I only need to generate speaker_moving once per row.
				if(i==0)
				{
					GameObject speakerNew=Instantiate(_speaker_mover, new Vector3(_transform.position.x+(i*(spacing_wide+prefab_width)), _transform.position.y+(j*(spacing_high+prefab_height)), _transform.position.z), Quaternion.identity) as GameObject;
					speakers[j]=speakerNew.GetComponent<Notes>();
					speakers[j].SetPitch(height, j);
					//Debug.Log ("GridMaker.speaker made["+j+"]");
				}
				//Creation of the blocks
				GameObject noob=Instantiate(_prefab, new Vector3(_transform.position.x+(i*(spacing_wide+prefab_width)), _transform.position.y+(j*(spacing_high+prefab_height)), _transform.position.z), Quaternion.identity) as GameObject;
				
				//Assignment of the StepProperties values
				StepProperties _props=noob.GetComponent<StepProperties>();
				//I shouldn't need this since the speaker will be keeping track of which block is which step
				//_props.SetStep(i);

				//setting initial activation
				float _num =Random.Range (0f, 1f);
				//This is the likelihood that the block will be activated initially
				if(_num<=0.22)
				{
					_props.SetActive(true);
				}

				//Setting fades
				_props.SetFadeInRate(_numFadeIn);
				_props.SetFadeOutRate(_numFadeOut);
				
				//Setting colors
				_props.SetColors(color_initial, color_activated, color_beat);

				
				//Debug.Log ("GridMaker.noob.transform="+noob.transform);
				speakers[j].Debugging(noob.transform);
				//Put new block in its speaker's array of blocks.
				speakers[j].PopulateBlocks(noob.transform, i, _props);
				

			}
		}		
	} 

 


Well for null reference errors, the way I always solve them is to look for the reference type variables involved and do a 

if (object == null)
{
    print("Object is null");
}

just to figure out what is actually null.  There are tools out there for integration with Visual Studio (which you can use as a line-by-line debugger), but I've never gotten them to work consistently.  From what I see here, this is what I think you should check:

 

Just a thought: the variable _prefab you use during the Instantiate call doesn't seem to be referenced anywhere else, so is it actually populated?  If not, that would result in the noob object being null.  Also, and I don't know if you are doing this, but if you have the prefab field reference the in-scene object itself, the Instantiate call will actually create a duplicate of the current in-scene object rather than the prefab it is based on.  I only mention this because I made that mistake once with an enemy that can split multiple times, and it took me a while to figure out since it can create odd situations.
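
For this particular case, a couple of quick checks along those lines (just a sketch against the GridMaker code above) would help narrow it down:

// Quick sanity checks before using the instantiated objects (sketch only).
if (_prefab == null)
{
    Debug.LogError("GridMaker: _prefab is not assigned in the Inspector");
}
if (speakers[j] == null)
{
    Debug.LogError("GridMaker: the speaker prefab has no Notes component");
}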

