I'm struggling with a script we're writing for a customer. Each night, we receive a CSV file exported from an Oracle database containing users that we create as contacts in Active Directory.
So, I read in the CSV and the existing contacts from AD, and do a compare to find those that don't already exist in AD. Those are the ones we create.
If this were a few hundred users it would be fine, but the file currently contains 52,000 users, and my script just isn't fast enough: right now it handles roughly 5,000 users per 30 minutes.
So, to make it as efficient as possible, I'm looking for ways to speed up the array lookups. Each user has a unique number (membernumber), so I was thinking that sorting both tables on this number and stopping the lookup at the first match would help (i.e. don't keep scanning all 52,000 lines if you found what you needed on line 3).
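(For comparison: sorting plus early exit helps, but a hashtable keyed on the unique number avoids scanning at all, since each lookup is constant time. A minimal sketch; the `membernumber` property name and the function/parameter names are assumptions based on the description above:)

```powershell
# Sketch: index the existing AD contacts in a hashtable keyed on membernumber
# so each CSV row needs only one O(1) lookup instead of an array scan.
function Get-NewUsers {
    param($CsvUsers, $AdContacts)

    # One-time O(n) cost to build the index.
    $existing = @{}
    foreach ($contact in $AdContacts) {
        $existing[$contact.membernumber] = $true
    }

    # Only rows whose membernumber is absent from AD need creating.
    $CsvUsers | Where-Object { -not $existing.ContainsKey($_.membernumber) }
}
```

With 52,000 rows this turns ~52,000 × 52,000 comparisons into one pass over each list.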
I know this is general, but any pointers on how to do efficient lookups/comparisons on large amounts of data would be much appreciated.
Here is another superfast way:
[System.Collections.ArrayList]$list = 1..300000 # this would be your large list of memberids
$list.Contains($idcompare) # this looks up an id in your arraylist and returns true/false
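(One caveat worth noting: `ArrayList.Contains` still walks the list item by item, so it's linear per lookup. A generic `HashSet` does the same membership check in constant time, which matters at 52,000+ ids. A small sketch:)

```powershell
# A HashSet[int] gives constant-time Contains, unlike ArrayList's linear scan.
$set = [System.Collections.Generic.HashSet[int]]::new()
foreach ($id in 1..300000) { [void]$set.Add($id) }

$set.Contains(299999)  # True
$set.Contains(300001)  # False
```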
Ah, didn't know about Compare-Object, I'll check that out! Thanks!
Just wanted to let you know, Compare-Object is EXACTLY what I need. This is a script that runs each night, and the changes from night to night are small (maybe 0.1-1%), so I'll pipe everything through a compare first (against last night's run) and then have a much smaller set of data to perform the AD manipulations on.
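(For anyone finding this later, the nightly diff could look something like this. The in-memory lists stand in for the two `Import-Csv` results, and the `membernumber` column name is an assumption:)

```powershell
# Sketch: diff tonight's feed against last night's on membernumber.
$lastNight = 1..5 | ForEach-Object { [pscustomobject]@{ membernumber = $_ } }
$tonight   = 1..7 | ForEach-Object { [pscustomobject]@{ membernumber = $_ } }

# SideIndicator '=>' marks rows that exist only in the difference set,
# i.e. users new in tonight's file that still need AD contacts created.
$added = Compare-Object -ReferenceObject $lastNight -DifferenceObject $tonight -Property membernumber |
         Where-Object { $_.SideIndicator -eq '=>' }
```

Only `$added` (here membernumbers 6 and 7) would then go through the expensive AD-creation step.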
Again, thanks so much. Saved my day!
Pleasure... glad it works so well for you!