I'm doing a Unity project where I need to convert UTM coordinates to latitudes and longitudes. I have tried several C# solutions, but none of them were accurate enough. However, I found some JavaScript code that gives exactly the result I'm looking for (https://www.movable-type.co.uk/scripts/latlong-utm-mgrs.html). The problem is that when I converted the code to C#, it gives a different result. Here are the pieces of code I see the problem in:

Javascript:

var a = 6378137;
var f = 1/298.257223563;
var e = Math.sqrt(f*(2-f));
var n = f / (2 - f);
var n2 = n*n, n3 = n*n2, n4 = n*n3, n5 = n*n4, n6 = n*n5;
var A = a/(1+n) * (1 + 1/4*n2 + 1/64*n4 + 1/256*n6);

And the C# code I converted myself:

var a = 6378137;
var f = 1 / 298.257223563;
var e = Math.Sqrt(f * (2 - f));
var n = f / (2 - f);
var n2 = n * n;
var n3 = n2 * n;
var n4 = n3 * n;
var n5 = n4 * n;
var n6 = n5 * n;
var A = a / (1 + n) * (1 + 1 / 4 * n2 + 1 / 64 * n4 + 1 / 256 * n6);

They should be identical, but the value of A differs: JavaScript gives 6367449.145823415 (what I want), while C# gives 6367444.6571225897487819833001. I could just use the JavaScript code in my project, but conveniently Unity dropped JavaScript support just last year.

The inputs are the same for both functions. What could be the issue here?

Asked Aug 16, 2018 at 14:08 by taiteilijaumbra; edited Aug 16, 2018 at 14:20 by Feras Al Sous.
  • Make sure your current culture (en-GB) is the same on the client (browser) and the server (C# code). – Ismail Yilmaz Commented Aug 16, 2018 at 14:14
  • @IsmailYilmaz Culture has nothing to do with mathematical calculations. All the numbers here are literals (not somehow parsed from strings). – Pac0 Commented Aug 16, 2018 at 14:33
  • @Pac0 Thank you for informing me about this. – Ismail Yilmaz Commented Aug 16, 2018 at 14:38

3 Answers

You have an issue in the last expression:

var A = a / (1 + n) * (1 + 1 / 4 * n2 + 1 / 64 * n4 + 1 / 256 * n6);

In C#, 1 / 4 * n2 evaluates to 0 because 1 and 4 are treated as integers by default, and the integer division 1 / 4 yields 0. The same thing happens to 1 / 64 * n4 and 1 / 256 * n6. In JavaScript, by contrast, there are only 64-bit floating-point numbers, so 1 / 4 evaluates to 0.25.

Possible workaround:

var A = a / (1 + n) * (1 + 1 / 4.0 * n2 + 1 / 64.0 * n4 + 1 / 256.0 * n6);

Now answers seem to be exactly the same.

Note: as @Lithium mentioned, you may find it more elegant to append the d suffix rather than .0 to the number to mark it as a double.
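The pitfall is easy to reproduce in isolation. Java shares C#'s rule that dividing two integer literals performs integer division (and also accepts the d suffix), so here is a minimal sketch of the same bug and both fixes; the n2 value is purely illustrative:

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        double n2 = 0.5; // stand-in value, just for illustration

        // 1 / 4 is evaluated as integer division, giving 0,
        // so the whole term collapses to 0.0.
        double broken = 1 / 4 * n2;

        // Making either operand floating-point forces real division.
        double fixedDotZero = 1 / 4.0 * n2; // 0.25 * 0.5
        double fixedSuffix  = 1 / 4d  * n2; // same, using the 'd' suffix

        System.out.println(broken);       // 0.0
        System.out.println(fixedDotZero); // 0.125
        System.out.println(fixedSuffix);  // 0.125
    }
}
```

The same change applied to every 1 / 4, 1 / 64 and 1 / 256 term in the question's expression makes the C# result match the JavaScript one.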

This has to do with how the two languages handle numbers: JavaScript has only 64-bit floating-point numbers, so 1/4 is 0.25, while C# performs integer division on integer literals. If you would like to know more about some of the quirks of JavaScript numbers, check out the link below.

https://www.w3schools.com/js/js_numbers.asp

In your C# code, make sure the variables involved are of type double. A double is twice as precise as a single-precision float.

This should give you an identical answer.

Most languages have the concept of a single-precision and a double-precision floating-point number (32-bit vs 64-bit). Because JavaScript is not strongly typed, it simply uses 64 bits for all of your numbers, whereas in C# the types are inferred from the literals. You should also avoid var in C# wherever possible and spell out the actual type, to avoid situations like this :p
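To see how much the two widths differ, here is a small sketch in Java, whose float and double are the same IEEE 754 formats as C#'s (and JavaScript's number is the 64-bit one):

```java
public class PrecisionDemo {
    public static void main(String[] args) {
        // The reference value from the question, as a 64-bit double.
        double asDouble = 6367449.145823415;

        // A 32-bit float carries only ~7 significant decimal digits;
        // at this magnitude adjacent floats are 0.5 apart, so the
        // fractional part is rounded away entirely.
        float asFloat = (float) asDouble;

        System.out.println(asDouble);
        System.out.println(asFloat); // 6367449.0
    }
}
```

So even a correctly translated formula would lose the digits the question cares about if any intermediate were stored in a 32-bit float.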

Tags: C# and JavaScript code calculations giving different results, Stack Overflow