Homebrew scripts vs Internal Function calls


46 replies to this topic

#1
Failed.Bard
  • Members
  • 774 messages
Starting this as a new topic, rather than cluttering up the Listener one.


[quote]Lightfoot8 wrote...

[quote]Zarathustra217 wrote...

Sorry for the off topic, but I find this very interesting. I figure this also entails that if you have to look for a handful of spell effects on a creature, it is faster to use GetHasSpellEffect several times than to loop through each effect on the creature and check the spell ID of that effect. Normally, I would expect the latter solution to be more effective (one search vs. multiple), but given that GetHasSpellEffect is an internal function, it would be far faster than the scripted loop.

Thoughts?

[/quote]

Yep, that is what is being said.


[quote]Failed.Bard wrote...

  A well-written homebrew function compiled with one of the better compilers (like the one integrated into Virusman's toolset extender) will likely be faster than the BioWare equivalent 95% of the time.  Every single replacement function I've written and compared to the BioWare default has been faster, even if only by a few percent over thousands of cycles.

  Likely this deserves its own topic though, so people could get more in depth into replacement function comparisons.
[/quote]

Yes, homebrew vs. homebrew can be compiled better using a better compiler.

Try rewriting an internal function in NWScript, something like GetItemPossessedBy, and see if you get any gain. [/quote]
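
  For reference, the scripted effect loop Zarathustra217 describes would look roughly like this (an untested sketch; the three SPELL_* constants just stand in for whatever handful of spells you happen to be checking):

[quote]
int GetHasAnyListedSpellEffect (object oCreature)
{
    // Walk every effect on the creature once, comparing the spell that created it.
    effect eLoop = GetFirstEffect (oCreature);
    while (GetIsEffectValid (eLoop))
    {
        int nSpell = GetEffectSpellId (eLoop);
        if (nSpell == SPELL_BLESS || nSpell == SPELL_AID || nSpell == SPELL_PRAYER)
            return TRUE;
        eLoop = GetNextEffect (oCreature);
    }
    return FALSE;
}[/quote]

  The claim above is that a few direct GetHasSpellEffect calls still beat this single scripted pass, because each of those calls runs entirely inside the engine.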


  I'd realized after I made my post that most of the functions I've replaced are ones that were in includes, which weren't the sort that Lightfoot8 had been referring to in his original post on it anyways.

  Regardless, since I'm always curious about this sort of thing, I did a bit of testing on it, using everyone's favourite inefficient function: GetNearestObjectByTag.

  This one is fairly cycle heavy.  I'd originally planned to run the tests at 1000 loops of each script variant, but it was already giving TMI errors after 23 loops with 100 equally tagged placeables, so I ran it with loops of 20 instead.

The main command function:

[quote]
void main()
{
    // Fired from a placeable's OnClick event; oPC is the clicking PC.
    object oPC = GetPlaceableLastClickedBy();

    int i;
    for (i = 0; i < 20; i++)
    {
        ExecuteScript ("test_gnobt_bio", oPC);
        ExecuteScript ("test_gnobt_1", oPC);
        ExecuteScript ("test_gnobt_2", oPC);
    }
}[/quote]

"test_gnobt_bio"
[quote]
void main()
{
    object oTest = GetNearestObjectByTag ("x2_easy_Barrel");
}[/quote]

"test_gnobt_1"
[quote]
object GetNearestObjectByTag_1 (string sTag, object oSource = OBJECT_SELF)
{
    object oNearest = OBJECT_INVALID;
    int i;
    float fNearest = 3000.0, fTest;

    object oArea = GetArea (oSource);

    // Walk the module-wide list of objects with this tag, keeping the
    // closest one that shares oSource's area.
    object oTest = GetObjectByTag (sTag);
    while (GetIsObjectValid (oTest))
    {
        oTest = GetObjectByTag (sTag, i);
        if (GetArea (oTest) == oArea)
        {
            fTest = GetDistanceBetween (oSource, oTest);
            if (fTest < fNearest)
            {
                oNearest = oTest;
                fNearest = fTest;
            }
        }
        oTest = GetObjectByTag (sTag, ++i);
    }
    return oNearest;
}

void main()
{
    object oTest = GetNearestObjectByTag_1 ("x2_easy_Barrel");
}[/quote]

"test_gnobt_2"


[quote]
object GetNearestObjectByTag_2 (string sTag, object oSource = OBJECT_SELF)
{
    object oTest, oNearest = OBJECT_INVALID;
    float fNearest = 3000.0, fTest;

    object oArea = GetArea (oSource);

    // Walk every object in oSource's area, keeping the closest one with a matching tag.
    oTest = GetFirstObjectInArea (oArea);
    while (GetIsObjectValid (oTest))
    {
        if (GetTag (oTest) == sTag)
        {
            fTest = GetDistanceBetween (oSource, oTest);
            if (fTest < fNearest)
            {
                oNearest = oTest;
                fNearest = fTest;
            }
        }
        oTest = GetNextObjectInArea (oArea);
    }
    return oNearest;
}

void main()
{
    object oTest = GetNearestObjectByTag_2 ("x2_easy_Barrel");
}[/quote]


  The results were interesting, not because the BioWare one was faster (I expected that with the higher placeable-count searches), but because of how wide a range the script profiler gave for the exact same tests when there were only 10 placeables with that tag to search through.

  Here are the numbers I got:
[quote]
100 equally tagged placeables:
test_gnobt_bio 200 - 140
test_gnobt_1    200 - 173
test_gnobt_2    200 - 219

50 equally tagged placeables:
test_gnobt_bio 200 - 77
test_gnobt_1    200 - 93
test_gnobt_2    200 - 141

20 equally tagged placeables:
test_gnobt_bio 200 - 46
test_gnobt_1    200 - 79
test_gnobt_2    200 - 94

10 equally tagged placeables:
test_gnobt_bio 200 - 92
test_gnobt_1    200 - 31
test_gnobt_2    200 - 62

test_gnobt_bio 200 - 32
test_gnobt_1    200 - 63
test_gnobt_2    200 - 61

test_gnobt_bio 200 - 63
test_gnobt_1    200 - 63
test_gnobt_2    200 - 63

test_gnobt_bio 200 - 62
test_gnobt_1    200 - 15
test_gnobt_2    200 - 94

test_gnobt_bio 200 - 64
test_gnobt_1    200 - 47
test_gnobt_2    200 - 91
[/quote]

  This was in a small hakless testing module as well, and it seemed like overhead had a bigger effect on script execution time than the scripts themselves did.

  With these scripts, I'd expect the homebrew to be faster in areas with huge numbers of placeables when searching for one or two like-tagged objects, but not in most normal circumstances.


  I think I'll take a look through some more of the default functions to see if there are any others that might be worth testing against replacement functions.
 Certainly anything in an include file can be done more efficiently, but I'm curious whether some of the internal ones could be as well.

edit: fixed an error I noticed in the test scripts posted.  I forgot to assign fTest, so the first object found was always detected as the nearest.  It wouldn't have affected the execution-speed comparisons though.

Edited by Failed.Bard, 21 July 2012 - 03:12.


#2
Failed.Bard
  • Members
  • 774 messages
  I found another laggy internal routine to compare: StringToInt.  Admittedly, there's likely a more efficient method than the one I came up with for it, but the internal function came out well ahead in this one also.

  I did find one interesting thing doing this comparison though: with a mixed string of numbers and letters, the internal function reads from left to right, stopping at the first non-number.  Mine reads from right to left, stopping at the first non-number.
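
  For example, with the test string used below (the expected values follow from that behaviour; _StringToInt is the homebrew version given further down):

[quote]
void main()
{
    int nBio  = StringToInt ("889adfiy224");   // 889: parsed from the left, stops at 'a'
    int nMine = _StringToInt ("889adfiy224");  // 224: parsed from the right, stops at 'y'
}[/quote]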

  The scripts I used for comparison:
The main function caller:

void main()
{
    int i;
    for (i = 0; i < 250; i++)
    {
        ExecuteScript ("test_s2i_bio", OBJECT_SELF);
        ExecuteScript ("test_s2i_1", OBJECT_SELF);
    }
}



"test_s2i_bio"

void main()
{
    int nVal = StringToInt ("889adfiy224");
}



"test_s2i_1"

int _StringToInt (string sString)
{
    float fVal;
    int nTest, i;
    int nLength = GetStringLength (sString) - 1;

    // Read the string from right to left, adding digit * 10^i for each position,
    // and stop at the first character that isn't a digit.
    for (i = 0; i <= nLength; i++)
    {
        nTest = FindSubString ("0123456789", GetSubString (sString, nLength - i, 1));
        if (nTest == -1) return FloatToInt (fVal);
        else fVal += IntToFloat (nTest) * pow (10.0, IntToFloat (i));
    }
    return FloatToInt (fVal);
}

void main()
{
    int nVal = _StringToInt ("889adfiy224");
}


and the results:

string - "889adfiy224" 
test_s2i_bio 5000 - 606
test_s2i_1    5000 - 894

test_s2i_bio 5000 - 439
test_s2i_1    5000 - 918

string - "1243890734"
test_s2i_bio 5000 -  580
test_s2i_1    5000  - 1419


  Based on those results, I think the internal function must check left to right through the whole string, find the spot where the numbers stop, then work backwards converting them to a number.
  Otherwise the significant increase in time for mine with the extra digits (to be expected, more operations == more time), compared to the internal routine's roughly unchanged numbers, doesn't make sense.
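
  Purely to illustrate that guess (this is speculation about how the internal function behaves, not BioWare's actual code), a two-pass version in script terms might look like:

[quote]
int StringToInt_Guess (string sString)
{
    // Pass 1: scan left to right to find where the leading digits stop.
    int nStop = 0, nLength = GetStringLength (sString);
    while (nStop < nLength
        && FindSubString ("0123456789", GetSubString (sString, nStop, 1)) != -1)
        nStop++;

    // Pass 2: work backwards over just that leading run, adding up place values.
    int i, nMult = 1, nVal = 0;
    for (i = nStop - 1; i >= 0; i--)
    {
        nVal += FindSubString ("0123456789", GetSubString (sString, i, 1)) * nMult;
        nMult *= 10;
    }
    return nVal;
}[/quote]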

  It does look like The.Gray.Fox was right though: even a slow internal routine is faster than an external one.  They might be comparable in some scenarios, but overall not so much.
  That's a bit disheartening to me, since I was considering rewriting the whole spell engine for my next scripting project.  It looks like there'd be an extra few fractions of a millisecond to contend with for every internal function replaced in any sort of large scripted system overhaul.

#3
Shadooow
  • Members
  • 4 465 messages

The results were interesting, not because the BioWare one was faster (I expected that with the higher placeable-count searches), but because of how wide a range the script profiler gave for the exact same tests when there were only 10 placeables with that tag to search through.

Yes, that's the basic issue and the reason why profiling isn't reliable. Nor is using a loop with 1000 iterations reliable, since script speed also depends on whatever else is running at the same time. Basically, there is absolutely no reason to think a homebrew function can be faster than an internal one, since... you can only use math and internal functions again.

#4
Zarathustra217
  • Members
  • 221 messages
Out of curiosity - are you using the Bioware profiler or the NWNX one?

#5
Failed.Bard
  • Members
  • 774 messages
The Bioware one. It's a nuisance pulling my little server down just to profile a script, though since it's empty and just used for testing most of the time, it wouldn't matter much. It'd be interesting to run both at the same time and compare the results though.

#6
Shadooow
  • Members
  • 4 465 messages

Failed.Bard wrote...

The Bioware one. It's a nuisance pulling my little server down just to profile a script, though since it's empty and just used for testing most of the time, it wouldn't matter much. It'd be interesting to run both at the same time and compare the results though.

The problem is that the results from the NWNX profilers are similarly inconsistent. You won't get any tangible results from either profiler. At least that's true for nwnx_profiler; I haven't tried nwnx_time.

#7
Failed.Bard
  • Members
  • 774 messages

ShaDoOoW wrote...

The results were interesting, not because the BioWare one was faster (I expected that with the higher placeable-count searches), but because of how wide a range the script profiler gave for the exact same tests when there were only 10 placeables with that tag to search through.

Yes, that's the basic issue and the reason why profiling isn't reliable. Nor is using a loop with 1000 iterations reliable, since script speed also depends on whatever else is running at the same time. Basically, there is absolutely no reason to think a homebrew function can be faster than an internal one, since... you can only use math and internal functions again.


  Some homebrew functions can definitely be faster than the internal equivalents, but only in certain situations.

  If GNOBT does loop through all the objects in the area, which it seems to, and you're searching for a uniquely tagged object in an area with hundreds of placeables, it'd be far less efficient than a homebrew using the GOBT list and an area check.  Unless you know that's going to be the situation ahead of time though, on average you'd be wasting CPU cycles using a replacement function.

#8
Shadooow
  • Members
  • 4 465 messages
Hmm, right, that's true. Though I'm not sure whether it's actually homebrew-related. Isn't this rather GOBT-related?

#9
Failed.Bard
  • Members
  • 774 messages

ShaDoOoW wrote...

Hmm, right, that's true. Though I'm not sure whether it's actually homebrew-related. Isn't this rather GOBT-related?


   Yup, but it's a case where a homebrew function could outperform an internal one by handling the task with an alternate method.

  Say you give each area a specific respawn waypoint with the same tag in every area.  GNOBT would be better if you have more areas than average placeables per area.  If you tend to have a huge number of placeables per area, but not many areas, then the homebrew that gets them by tag and compares areas will be faster.

  Granted, it's a bad example because tagging the waypoint RESPAWN_ + area tag would be more efficient in every circumstance, since each tag would be unique and GOBT would get the right one instantly.
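
  A minimal sketch of that uniquely-tagged approach (the RESPAWN_ prefix and helper name here are just made up for the example):

[quote]
// One waypoint per area, tagged "RESPAWN_" + the area's tag, so
// GetObjectByTag finds it directly with no searching or looping.
object GetAreaRespawnPoint (object oPC)
{
    return GetObjectByTag ("RESPAWN_" + GetTag (GetArea (oPC)));
}[/quote]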


  In the case of StringToInt, having an alternate method that checks for numbers at the end of a mixed string instead of the beginning could be useful, but it's certainly not needed as a replacement for purely numerical strings.
  A wrapper that ran that function when StringToInt returned a zero - to confirm it was a real zero and not an error, and to check against an empty string - could be handy, but it would be doubly inefficient and most of the time completely unneeded.
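
  A rough sketch of such a wrapper (SafeStringToInt is a made-up name, and it assumes the _StringToInt from the earlier post is available):

[quote]
int SafeStringToInt (string sString)
{
    int nVal = StringToInt (sString);
    // A zero could be a genuine zero, an empty string, or a string with no leading digits.
    if (nVal == 0 && sString != "" && GetStringLeft (sString, 1) != "0")
        nVal = _StringToInt (sString);   // fall back to the right-to-left parser
    return nVal;
}[/quote]

  As noted, it pays for both parses whenever the first result is zero, so it's only worth it if trailing digits actually matter to the caller.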


  Still, sometimes it's good to try out some of these things yourself just to double-check someone else's results, or just to better understand how something was done, even if it doesn't lead to anything important.

#10
Zarathustra217
  • Members
  • 221 messages
Indeed. I actually checked out the scripts you made, and even in our large module with about 900 areas, using GetObjectByTag came out much, much faster.

Thanks for this - the entire GOBT vs. GNOBT thing had completely escaped me and I was still on the old bandwagon. Now to go rewrite the optimizations.

#11
Failed.Bard
  • Members
  • 774 messages

Zarathustra217 wrote...

Indeed. I actually checked out the scripts you made, and even in our large module with about 900 areas, using GetObjectByTag came out much, much faster.

Thanks for this - the entire GOBT vs. GNOBT thing had completely escaped me and I was still on the old bandwagon. Now to go rewrite the optimizations.


It was actually Mavrixio from Sinfar that first pointed it out.  It apparently used to be the other way around, but it works out that GOBT is in the same speed range as GetLocal*, and even faster when the object has many variables on it.  If I can find that old post I'll edit to add the link to that discussion - it was interesting.

Edit:  Here it is.

Edited by Failed.Bard, 16 July 2012 - 10:30.


#12
Zarathustra217
  • Members
  • 221 messages
We ought to work together to profile and document every central NWN scripting function for performance. It seems so unpredictable - and apparently, we can't even be sure that what the Bioware people have said in the past still sticks.

#13
Failed.Bard
  • Members
  • 774 messages
    It's unfortunate the web version of the Lexicon isn't still being updated. There's some misinformation in some of the functions there, on things that were changed or fixed in later patches.

  The idea of a function performance index of sorts is an interesting one. Having an idea of the average overhead of the various functions might lead people to explore alternate methods of doing things.
 
 Experimenting is always a good thing. As old as NWN is, people are still finding new things that it can do, or, in the case of the extenders, be made to do.

Edited by Failed.Bard, 17 July 2012 - 04:17.


#14
Shadooow
  • Members
  • 4 465 messages

Failed.Bard wrote...

    It's unfortunate the web version of the Lexicon isn't still being updated. There's some misinformation in some of the functions there, on things that were changed or fixed in later patches.

Yes, but you can write what you find out in the Lexicon thread on these forums. If someone decides to make an update in the future, it will make things easier.

#15
Zarathustra217
  • Members
  • 221 messages
I think we would be better off with something more wiki-like, at least as a supplement. I've personally often found quirks and oddities I would like to have shared but didn't know where to.

#16
acomputerdood
  • Members
  • 219 messages

Zarathustra217 wrote...

I think we would be better off with something more wiki-like, at least as a supplement. I've personally often found quirks and oddities I would like to have shared but didn't know where to.


well......

what would be the legal/copyright ramifications of mirroring the data?

i just did a recursive wget of the site and now have an exact mirror of all their data.  i'm sure it wouldn't be difficult to write some scripts to dump it all into a wiki.  provided doing that would be kosher.  :D



EDIT:  ahhh, i now see there's a newer version on the vault.  i'm also assuming then that there's no reason why it can't be dumped into a wiki.

Edited by acomputerdood, 18 July 2012 - 12:37.


#17
Shadooow
  • Members
  • 4 465 messages

acomputerdood wrote...

Zarathustra217 wrote...

I think we would be better off with something more wiki-like, at least as a supplement. I've personally often found quirks and oddities I would like to have shared but didn't know where to.


well......

what would be the legal/copyright ramifications of mirroring the data?

i just did a recursive wget of the site and now have an exact mirror of all their data.  i'm sure it wouldn't be difficult to write some scripts to dump it all into a wiki.  provided doing that would be kosher.  :D

Speaking of The Krit's NWN Wiki, the formatting is quite bad there. The Lexicon has a great search engine (functions in the left menu) and nice formatting. The Krit and Whizard are adding script functions to the wiki, but it's weird. The NWN Wiki doesn't seem to me a good place for this.

Since I've helped the 1.69 Lexicon team, I know that making an HTML page is quite easy. I'm not sure how to compile a Windows help page, but that's not needed - the only thing we need is access to the Lexicon website. Then it will be easy to change/update things.

#18
acomputerdood
  • Members
  • 219 messages
i've put some effort into this, and have started importing the lexicon pages into a wiki:

http://www.dalakora....pecial:AllPages

i'm not 100% happy with the formatting - i used html2wiki (a perl module) to do the translations. i've done some scripting to clean up the pages some, but i think it needs more.

ideally, some of the links need to be fixed up. anyway, just a first cut.


PS: oh, and don't get ambitious and try to update any of the pages. i'll likely blow the entire thing away once i'm done mucking with the formatting.

Edited by acomputerdood, 18 July 2012 - 06:48.


#19
acomputerdood
  • Members
  • 219 messages
i'm uploading my fixed-up pages right now, and i think they look pretty good! there's some work to be done editing the sidebar and stuff to make it look more friendly, but the wiki page content itself is good. any comments?

i realize now i should probably post in the actual lexicon thread and stop hijacking this one.

#20
Shadooow
  • Members
  • 4 465 messages
Sorry, but this is an even worse solution than writing this into The Krit's NWN Wiki.

The formatting is terrible, searching for functions too, and the website address is unknown - I mean, the current scripters all know the Lexicon and the NWN Wiki; unless they stop by here, they will never find out that the Lexicon is being further updated.

We don't need a "wiki" engine actually - we can discuss the knowledge and remarks about functions on these forums; as long as every contributor tells the others what he changed, there won't be any issues.

Edited by ShaDoOoW, 19 July 2012 - 10:57.


#21
Failed.Bard
  • Members
  • 774 messages
With the way the Bioware search function works, the key to storing it here would be ensuring that any topics on specific functions have that function name in the thread title. Barring a new moderator getting appointed so we could get a thread stickied for it, that is.

#22
acomputerdood
  • Members
  • 219 messages

ShaDoOoW wrote...

The formatting is terrible


article formatting?  i'm still working on fixing them up, but i don't think it's THAT bad.  certainly not "terrible".  take a look at:

http://www.dalakora....stSpellAtObject

to me it looks quite similar to the lexicon itself.

i'm assuming you're referring to the wiki formatting, which i'm still working on.  the wiki categories don't work, but the "categories" sections from the lexicon are still quite functional.  it shouldn't be difficult to parse those out and wiki-ize them correctly.

i did a little work setting up subcategories and you can see the beginnings of them listed in tree view on the left panel.  there's no reason why that effort can't be finished so that it mimics the lexicon.


searching for functions too


searching works much better now that i finished renaming the function pages to the actual functions and not the original filenames.  still have many other categories to go, but once that's completed and all the junk is blown away i think it should work just fine.

and the website address is unknown


well of course it is!  i just set the page up this morning.  i don't own the DNS records for the lexicon so there's no way i can make it better known.  if something actually took off here we could consider an url that makes sense, but for my testing and proof of concept purposes, there's nothing wrong with my link.


I mean, the current scripters all know the Lexicon and the NWN Wiki; unless they stop by here, they will never find out that the Lexicon is being further updated


this isn't a problem i'm trying to solve.  people were griping that the lexicon wasn't being updated, and a wiki is a perfect solution for that.

We don't need a "wiki" engine actually - we can discuss the knowledge and remarks about functions on these forums; as long as every contributor tells the others what he changed, there won't be any issues.


i think the forums are a horrendous place to expect developers to wade through to find all the updates to the lexicon.  even with the stickied thread, it just doesn't scale.

if the lexicon were originally a wiki, then this entire issue wouldn't even exist.

#23
Shadooow
  • Members
  • 4 465 messages

acomputerdood wrote...

We don't need a "wiki" engine actually - we can discuss the knowledge and remarks about functions on these forums; as long as every contributor tells the others what he changed, there won't be any issues.


i think the forums are a horrendous place to expect developers to wade through to find all the updates to the lexicon.  even with the stickied thread, it just doesn't scale.

if the lexicon were originally a wiki, then this entire issue wouldn't even exist.

You misunderstood me. The point is that using the wiki engine is needless - we can simply copy the Lexicon pages to a new site.

Then a few editors could do it, using this forum for discussion and notification - using project forums, for example.

Although it would be best to gain access to the current site's FTP so we would not have to redirect scripters to a new site. Do we know that the current Lexicon site is abandoned and lost? If not, there is no point in mirroring it away yet; first let's make some efforts to contact the site admin/owners.

Edited by ShaDoOoW, 19 July 2012 - 05:33.


#24
acomputerdood
  • Members
  • 219 messages
here's my wget mirror of the lexicon:

http://www.dalakora....nwnlexicon.com/

but i've seen conflicting reports of how up to date it is. people here say it's only 1.68, but on the vault page for 1.69 it says that the lexicon was updated to include its changes.

for good measure, i broke out the vault's offline version for 1.69 back into html and uploaded it here as well:

http://www.dalakora....on-1.69/output/


and yes, you're right that the simplest would be to re-host the lexicon (or get access to the old) and have some admins in charge of updating it. that just opens the possibility of running into the same situation we have here.

anyway, i don't know how abandoned it is. all i've seen are posts claiming the owner isn't responding. don't know if anybody has checked the whois records to attempt contacting the person who owns the dns records or whatever.

#25
Shadooow
  • Members
  • 4 465 messages

acomputerdood wrote...

and yes, you're right that the simplest would be to re-host the lexicon (or get access to the old) and have some admins in charge of updating it. that just opens the possibility of running into the same situation we have here.

That can happen either way - the site is yours; if you leave the community and stop paying for the site, we are all screwed again. Until then, as long as you give FTP access to the part of your web space with the Lexicon to the well-known scripters who ask you, we are fine.

In this regard, the Lexicon would be safer under the neverwinter.info website, I think, as it's paid for in advance by the NWN community. The Amethyst Dragon, do you read this? Could you mirror the Lexicon there?

acomputerdood wrote...

anyway, i don't know how abandoned it is. all i've seen are posts claiming the owner isn't responding. don't know if anybody has checked the whois records to attempt contacting the person who owns the dns records or whatever.

Yes, I've heard the same, but thought maybe the situation had changed over time, as the website is still there... But since the Lexicon is not updated to 1.69, I guess the situation is still the same.