Melee System errors (UnityScript)

Hi, I'm new to coding and have been having problems with this code for months. No matter what I do, there always seems to be an error that pops up. Please help.
#pragma strict
var damage : int = 1;
var distance : float;
function update ()
{
var hit : raycasthit;
if (physics.raycast ( transform.position, transform.transformdirection ( vector3.forward , hit ); );
{distance = hit.distance; hit.transform.sendmessage ( "applydamage" , damage , sendmessageoptions.dontrequirereceiver); };
};

There are a lot of syntax and naming errors.
First of all, I think you should learn the basics of programming, then the syntax of the specific language, and then the platform libraries. Unity's API is case-sensitive, so these names need to be written exactly as follows:
1. SendMessageOptions.DontRequireReceiver
2. Physics.Raycast (its parameters and parentheses are also incorrect)
3. TransformDirection
4. RaycastHit
5. SendMessage
6. Update
Maybe the version below is cleaner:
#pragma strict

var damage : int = 1;
var distance : float;

function Update ()
{
    var hit : RaycastHit;
    // Cast a ray forward from this object; if it hits something, record the distance
    // and tell the hit object to apply damage.
    if (Physics.Raycast(transform.position, transform.TransformDirection(Vector3.forward), hit))
    {
        distance = hit.distance;
        hit.transform.SendMessage("applydamage", damage, SendMessageOptions.DontRequireReceiver);
    }
}
Also, please always look at the Unity documentation to understand the logic and resolve syntax errors.

Related

Infer type information inside if-constexpr

I want to explore the if constexpr feature and try to figure out type information at compile time.
For this purpose, I wrote the following code.
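The snippet was only posted as a screenshot; judging from the answer below, it presumably looked roughly like this (the values of x and y are inferred from the reported output, and the abbreviated function template requires C++20):
#include <iostream>
#include <type_traits>
// Presumed original: decltype(value) is const T& here, so is_integral_v is never true.
auto printTypeInfo(const auto& value) {
    if constexpr (std::is_integral_v<decltype(value)>) {
        return value + 1;
    } else {
        return value + 0.1;
    }
}
int main() {
    int x = 3;
    double y = 3.14;
    std::cout << printTypeInfo(x) << '\n';  // expected 4, actually prints 3.1
    std::cout << printTypeInfo(y) << '\n';  // prints 3.24
}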
I expect the printTypeInfo function to return 4 for x and 3.24 for y. However, it gives me 3.1 and 3.24.
I don't know where it goes wrong.
I expected decltype to infer the type as int, but it seems to infer double.
When I replace decltype with something else, it works (that snippet was only posted as a screenshot).
decltype(value) in your example will always be const T&, since that's your parameter type. References aren't integral, so you always add 0.1. You could use std::remove_reference_t to get the underlying type:
auto printTypeInfo(const auto& value) {
    if constexpr (std::is_integral_v<std::remove_reference_t<decltype(value)>>) {
        return value + 1;
    } else {
        return value + 0.1;
    }
}
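For reference, with std::remove_reference_t in place, the reconstructed example above returns 4 for x and 3.24 for y, matching the original expectation.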

What numbering system is used by the terminal when outputting to a script

I'm currently learning C programming, and wrote a script similar to one I've written in Python before. My goal is to learn how to pass input to an application and have it process the data I pass it.
The problem I'm having now is the feedback my application is giving me. I wrote a simple application to read keyboard input and give 1 of 3 responses based on what input I give it. The code is as follows:
/*Input test.*/
#include<stdio.h>
#include<stdlib.h>
char input;
const int option_a = 1;
const int option_b = 2;
int main()
{
printf("Lets get started! a for on or b for off?\n");
while(1)
{
input = getchar();
if(input == option_a)
{
printf("We're on.!\n");
}
else if(input == option_b)
{
printf("Off we go.\n");
}
else
{
printf("Excuse me, but I didn't get that.\n");
}
}
return 0;
}
Simply put, option_a is me pressing the 1 key on the keyboard, and option_b is the 2 key. When I press these keys, or any key for that matter, the application always goes to the "else" branch of the decision tree. So it's clear to me, for lack of a better term, that my application isn't seeing my input as the decimal number 1 or 2.
From the terminal, what is the structure of the data I'm sending to my application, or simply put, what does my 1 or 2 "look" like to my application?
When you take input with getchar() you get a character value, but you are comparing it with an integer. Instead of comparing against the integer, compare the input with the corresponding character constant. For example, use
const char option_a = '1';
const char option_b = '2';
I believe you are looking for the American Standard Code for Information Interchange (ASCII) table: '0' = 48, '1' = 49, '2' = 50, and so on. I had the same problem while working with Arduino's serial monitor, and it is all covered by the same standard.
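Putting both answers together, here is a minimal corrected sketch of the original program; the prompt is reworded to mention the 1 and 2 keys described in the question, and the newline that Enter leaves in the input buffer is skipped:
/* Input test, fixed: compare the input against character constants, not integers. */
#include <stdio.h>
const char option_a = '1';   /* the 1 key arrives as the character '1' (ASCII 49), not the number 1 */
const char option_b = '2';   /* the 2 key arrives as '2' (ASCII 50) */
int main(void)
{
    int input;               /* getchar() returns an int so it can also report EOF */
    printf("Lets get started! 1 for on or 2 for off?\n");
    while ((input = getchar()) != EOF)
    {
        if (input == '\n')   /* ignore the newline produced by pressing Enter */
            continue;
        if (input == option_a)
            printf("We're on!\n");
        else if (input == option_b)
            printf("Off we go.\n");
        else
            printf("Excuse me, but I didn't get that.\n");
    }
    return 0;
}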

CUnsignedChar has no subscript members

I am getting the error Type 'CUnsignedChar?' has no subscript members, which produces a lot of results on Stack Overflow; however, I can't seem to apply any of the other available answers to my example. It's clearly a casting issue, but I don't see how to overcome it.
I am doing an Obj-C to Swift conversion and I have a variable being set up as follows:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer : CUnsignedChar = bBuff1[0]
//..
//..
var backGreyBufferOffset : Int = localTexOffset * 512
var grey_val = 0
self.backGreyBuffer[Int(backGreyBufferOffset)]! = grey_val; //Subscript error here
This is the Obj-C code that uses in-outs:
unsigned char bBuff1[256*512];
unsigned char *backGreyBuffer = &bBuff1[0];
//..
grey_val = 0;
backGreyBuffer[backGreyBufferOffset] = grey_val;
Any suggestions about the right direction would be great.
I noticed that only a small change is needed in your code. You should make backGreyBuffer a pointer:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer = UnsafeMutablePointer(mutating: bBuff1)
// ....
var backGreyBufferOffset = localTexOffset * 512
backGreyBuffer[backGreyBufferOffset] = grey_val

Raw Input mouse lastx, lasty with odd values while logged in through RDP

When I attempt to update my mouse position from the lLastX, and lLastY members of the RAWMOUSE structure while I'm logged in via RDP, I get some really odd numbers (like > 30,000 for both). I've noticed this behavior on Windows 7, 8, 8.1 and 10.
The usFlags member returns a value of MOUSE_MOVE_ABSOLUTE | MOUSE_VIRTUAL_DESKTOP. Regarding the MOUSE_MOVE_ABSOLUTE, I am handling absolute positioning as well as relative in my code. However, the virtual desktop flag has me a bit confused as I assumed that flag was for a multi-monitor setup. I've got a feeling that there's a connection to that flag and the weird numbers I'm getting. Unfortunately, I really don't know how to adjust the values without a point of reference, nor do I even know how to get a point of reference.
When I run my code locally, everything works as it should.
So does anyone have any idea why RDP + Raw Input would give me such messed up mouse lastx/lasty values? And if so, is there a way I can convert them to more sensible values?
It appears that when using WM_INPUT through Remote Desktop, the MOUSE_MOVE_ABSOLUTE and MOUSE_VIRTUAL_DESKTOP bits are set, and the values seem to range from 0 to USHRT_MAX.
I never found clear documentation stating which coordinate system is used when the MOUSE_VIRTUAL_DESKTOP bit is set, but this seems to have worked well so far:
case WM_INPUT: {
    // RAWINPUT needs proper alignment; a suitably aligned byte buffer of sizeof(RAWINPUT) is enough for mouse input.
    alignas(RAWINPUT) BYTE buffer[sizeof(RAWINPUT)];
    UINT buffer_size = sizeof(buffer);
    GetRawInputData((HRAWINPUT)lparam, RID_INPUT, buffer, &buffer_size, sizeof(RAWINPUTHEADER));
    RAWINPUT* raw = (RAWINPUT*)buffer;
    if (raw->header.dwType != RIM_TYPEMOUSE) {
        break;
    }
    const RAWMOUSE& mouse = raw->data.mouse;
    if ((mouse.usFlags & MOUSE_MOVE_ABSOLUTE) == MOUSE_MOVE_ABSOLUTE) {
        // Absolute coordinates (typical over RDP): lLastX/lLastY range from 0 to USHRT_MAX
        // and have to be scaled to the (virtual) screen size.
        static Vector3 last_pos = vector3(FLT_MAX, FLT_MAX, FLT_MAX);
        const bool virtual_desktop = (mouse.usFlags & MOUSE_VIRTUAL_DESKTOP) == MOUSE_VIRTUAL_DESKTOP;
        const int width = GetSystemMetrics(virtual_desktop ? SM_CXVIRTUALSCREEN : SM_CXSCREEN);
        const int height = GetSystemMetrics(virtual_desktop ? SM_CYVIRTUALSCREEN : SM_CYSCREEN);
        const Vector3 absolute_pos = vector3((mouse.lLastX / float(USHRT_MAX)) * width, (mouse.lLastY / float(USHRT_MAX)) * height, 0);
        if (last_pos != vector3(FLT_MAX, FLT_MAX, FLT_MAX)) {
            MouseMoveEvent(absolute_pos - last_pos);
        }
        last_pos = absolute_pos;
    }
    else {
        // Relative movement (the usual case when running locally).
        MouseMoveEvent(vector3((float)mouse.lLastX, (float)mouse.lLastY, 0));
    }
}
break;

Make "int" and "float" work in swift playground

So I am just starting to teach myself a bit of iOS and am an absolute amateur (please don't be too harsh, and forgive questions whose answers might be obvious to you). I wanted to get the following code to work:
var int bildBreite;
var float bildHoehe, bildFlaeche;
var bildBreite = 8;
var bildHoehe = 4.5;
var bildFlaeche = bildBreite * bildHoehe;
but I get the errors seen in this screenshot (link: http://prntscr.com/6cdkvv ).
Concerning the screenshot: don't be confused by the "6" at the very right of line 10; that is there because it said "6" before the "4.5".
Also, this is an Xcode playground, and the code I am trying out is from an Objective-C beginner's guide, if that helps. (This whole Swift and Objective-C mix is quite confusing.)
I would really appreciate an answer, and if you have more questions to figure it out properly, I am happy to answer as quickly as possible.
All the best,
Leo
Let's go over the code:
var int bildBreite;
var float bildHoehe, bildFlaeche;
You are trying to set the types of some variables in a non-Swift way. In order to declare a variable of a specific type, you add the type after the variable name: var bildBreite: Int. Also note that class and struct names in Swift start with a capital letter, as @blacksquare noted.
var bildBreite = 8;
var bildHoehe = 4.5;
Here, it looks like you are trying to re-declare the variables. This isn't allowed.
var bildFlaeche = bildBreite * bildHoehe;
You are trying to multiply values of two different number types. This is not allowed in Swift; in order to do the math, you need to convert one operand to the other's type.
This version should work:
var bildBreite: Int;
var bildHoehe: Float;
var bildFlaeche: Float;
bildBreite = 8;
bildHoehe = 4.5;
bildFlaeche = Float(bildBreite) * bildHoehe;
The following code is a more concise version of your code:
// Inferred to be an Int
var bildBreite = 8
// This forces it to a Float, otherwise it would be a Double
var bildHoehe: Float = 4.5
// First, we convert bildBreite to a Float, then multiply it by the Float bildHoehe.
// The result is inferred to be a Float
var bildFlaeche = Float(bildBreite) * bildHoehe
int should be Int and float should be Float. Class and primitive names in Swift always begin with a capital letter.
To declare a variable in Swift, you do this:
var variableName: Int = intValue
Or you can let the compiler decide the type for you:
var variableNameA = 1
var variableNameB = 1.5
If you want to declare a variable and then assign it later you use an optional or an implicitly unwrapped optional:
var variableNameA: Int?
var variableNameB: Int!
variableNameA = 1
variableNameB = 2
var variableNameC = variableNameA! + variableNameB
As you can see from the last example, variableNameA is an optional (meaning it can hold a nil value), so it must be unwrapped with the ! operator before use; an implicitly unwrapped optional like variableNameB doesn't need to be unwrapped explicitly, but it's less safe and should be used with caution.
Note also that you only use var when you first declare a variable.
Good luck!
var bildBreite: Int = 8
var bildHoehe: Float = 4.5
var bildFlaeche: Float = Float(bildBreite) * bildHoehe
You don't need semicolons. And don't try to apply Java or any other language to this; learn the syntax and you will be good.