Scott Hanselman

Hanselminutes Podcast 23 - Scrum and Scrum Resources

July 09, 2006 Comment on this post [1] Posted in Podcast | ASP.NET | Subversion | XML | Tools

My twenty-third Podcast is up. This episode is about Scrum, an agile product management methodology.

There are a number of resources we talked about, but there are dozens we missed for lack of time (and knowledge!)

UPDATE: A review of this Podcast from Chris Chapman. We got it "kind-of right" which isn't too bad, IMHO. Be sure to get the full story and as always, read and read and read for yourself. Thanks Chris! Also, take a look at his History of Scrum.

However, listener and Scrum expert Howard van Rooijen (pronounced Royen) has put together a list of Scrum-related resources for us. Howard also let me know about an upcoming Certified ScrumMaster course taught by Mike Cohn on September 26-27, 2006 and an Agile Estimating and Planning course on September 28, 2006 in London (http://www.mountaingoatsoftware.com/)

  • Scrum for Team System - http://tinyurl.com/ztxrf - I know you (Scott) use a Subversion / CC.NET / rubber bands and voodoo environment, but we wrote a Scrum add-in for Team Foundation Server (with Ken Schwaber) and it's now available for download for free.
  • Scrum for Team System Process Guidance - http://tinyurl.com/jfhnf - One of the decisions we made early on was to separate the process guidance from the implementation so non-TFS folks could still benefit from the help and process guidance that we and Ken came up with. The site is a starter kit for people who want to learn more about Scrum - the written content is complemented by some videos of Ken describing the history of the Agile movement and some of the key aspects of the Scrum process
  • Ken Schwaber's Top Tips - http://tinyurl.com/e5ojr - a series of short videos of Ken talking about Agile & Scrum and a recording of one of Ken's workshops
  • Agile Software Development with Scrum Podcast Series - Scrum FAQ - http://tinyurl.com/hflnd - I was fortunate enough to spend a couple of hours with Ken Schwaber last autumn, and we recorded a series of podcasts where Ken answered some of the most frequently asked questions about Scrum.
  • A really good talk by Ken Schwaber - recorded by IT Conversations
  • Visual Studio 2005 SDK and Scrum - http://tinyurl.com/o5sjd - a post about the Microsoft Visual Studio 2005 SDK team's use of Scrum (and their channel9 video)
  • Scrum Development Yahoo Group - http://tinyurl.com/fnhfj - The place to go if you want to ask any question relating to Scrum - all the Luminaries hang out there.
  • Agile Alliance:  http://tinyurl.com/kbg28 - The home of Agile Methodologies - lots of good resources, whitepapers, research papers, articles, events and newsletters.
  • The Agile Manifesto - http://tinyurl.com/dyg4f - the core tenets of all the Agile Methodologies
  • Agile Software Development with Scrum - http://tinyurl.com/egmmf - The original book about learning Scrum - more about the processes and how to implement them. Fewer case studies than the Microsoft Press "Agile Project Management with Scrum".
  • Lean Software Development An Agile Toolkit - http://tinyurl.com/zfwl3 - this has become de facto reading at Conchango (every consultant has been given a copy) - the Lean Principles: <http://www.poppendieck.com/>  (also http://codebetter.com/blogs/darrell.norton/articles/50341.aspx ) are lessons learnt from the Lean Manufacturing process.
  • The 7 Wastes of Software Development -  This is something all developers should be aware of.
  • Implementing Lean Software Development : From Concept to Cash - http://tinyurl.com/f3bvt - Mary Poppendieck's follow up to "Lean Software Development: an Agile Toolkit"
  • User Stories Applied: For Agile Software Development - http://tinyurl.com/zsotm - Mike Cohn: <http://www.mountaingoatsoftware.com/> , one of the founders of the Agile Alliance - book about User Stories - a nice approach to defining requirements that can be used as a mechanism for defining your Product Backlog.
  • Agile Estimating and Planning - http://tinyurl.com/hsulp - building on the work done in User Stories Applied, Mike Cohn's book about how to estimate and plan Agile projects. Excellent stuff.

Thanks to Howard for the links!

We're listed in the iTunes Podcast Directory, so I encourage you to subscribe with a single click (two in Firefox) with the button below. For those of you on slower connections there are lo-fi and torrent-based versions as well.

Subscribe to my Podcast in iTunes

NEW COUPON CODE EXCLUSIVELY FOR HANSELMINUTES LISTENERS: The folks at XCeed are giving Hanselminutes listeners a coupon code: "hm-20-20." It'll work in their online shop or over the phone. This is an amazing deal, and I encourage you to check out their stuff. The coupon is good for 20% off any component or suite, with or without subscription, for 1 developer all the way up to a site license.

Our sponsors are XCeed, CodeSmith Tools, PeterBlum and the .NET Dev Journal. There's a $100 off CodeSmith coupon for Hanselminutes listeners - it's coupon code HM100. Spread the word, now's the time to buy.

As I've said before, this show comes to you with the audio expertise and stewardship of Carl Franklin. The name comes from Travis Illig, but the goal of the show is simple: avoid wasting the listener's time (and make the commute less boring).

  • The basic MP3 feed is here, and the iPod-friendly one is here. There are a number of other ways you can get it (streaming, straight download, etc.) that are all up on the site just below the fold. I use iTunes myself to listen to most podcasts, but I also use FeedDemon and its built-in support.
  • Note that for now, because of bandwidth constraints, the feeds always contain just the current show. If you want to get an old show (and because many podcasting clients aren't smart enough not to download the same file more than once), you can always find them at http://www.hanselminutes.com.
  • I have, and will, also include the enclosures in this feed you're reading, so if you're already subscribed to ComputerZen and you're not interested in cluttering your life with another feed, you have the choice to get the 'cast here as well.
  • If there's a topic you'd like to hear, perhaps one that is better spoken than presented on a blog, or a great tool you can't live without, contact me and I'll get it in the queue!

Enjoy. Who knows what'll happen in the next show?

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

Hosting By
Hosted on Linux using .NET in an Azure App Service

WatirMaker written again in Ruby

July 07, 2006 Comment on this post [1] Posted in ASP.NET | Ruby | Watir

I always thought WatirMaker was a pretty good idea. It was meant not as a recorder, per se, but rather as a "faster typer."

I use it to jump start spikes like my recent Vonage script. If I wrote Watir often enough I'd just use the Ruby interactive shell.

(By the way, if you have 15 minutes - maybe it's lunch - visit http://tryruby.hobix.com/ and try Ruby out, guilt- and install-free, in your browser.)

After I did WatirMaker in C#, Michael Kelly and John Hann wrote it again in native Ruby with the tiniest bit of help from me early on. It's rockin' sweet IMHO.

You can run it by simply running "ruby watirmaker.rb" from the command line, or by redirecting its output to a file: "ruby watirmaker.rb > myscript.rb".

John emailed me and said that he and Michael are going to look for a permanent home for this, but until then it's here.


Serializing Objects as JavaScript using Atlas, JSON.NET and AjaxPro

July 04, 2006 Comment on this post [25] Posted in ASP.NET | Ruby | Javascript | TechEd | Speaking | Web Services

Ajax is shiny. In our talk at TechEd, Patrick and I mentioned that our next plan was a dynamic endpoint for our financial services that spoke JSON, to complement our "Dirty SOAP" endpoint. This would make auto-complete dropdowns and sortable grids REALLY easy when interfacing with our SDK, which already supports a large message set for banking-type things like GetPayees and GetAccountHistory.

The first step to making this happen will be JSON serialization round-tripping. For example, I'd like to take this object (instance)...

public class Person
{
    public string firstName = "Scott";
    public string lastName = "Hanselman";
    public DateTime birthDay = new DateTime(1970, 1, 15, 1, 1, 0);
    public decimal moneyInPocket = 4.5M;
}

...and serialize it to JSON thusly:

{"firstName":"Scott", "lastName":"Hanselman", "birthDay": new Date(1213260000), "moneyInPocket":4.5}
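As an aside, that payload isn't strictly JSON: the birthDay value is a JavaScript expression (new Date(...)), so clients of the era evaluated the whole thing with eval() rather than a parser. The round-trip idea itself can be sketched in a few lines of TypeScript (the date field is elided here; the variable names are mine, not part of any of the libraries discussed):

```typescript
// The wire format from above, minus the non-strict-JSON date field.
const wire = '{"firstName":"Scott", "lastName":"Hanselman", "moneyInPocket":4.5}';

// Deserialize, mutate (mirroring the Replace("Scott", "Fred") trick the
// C# tests in this post use), and serialize again for the server.
const person = JSON.parse(wire);
person.firstName = "Fred";
person.lastName = "Jones";
const back: string = JSON.stringify(person);

console.log(back); // {"firstName":"Fred","lastName":"Jones","moneyInPocket":4.5}
```

The point is simply that the client and server agree on a plain object graph; anything that isn't expressible as literals (like dates) needs a convention on top.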

I was already planning to create a JavaScript serializer, as Corillian already has fixed-length, delimited, name-value-pair and other serializers for any object.

I took a look at JSON.NET, thinking it'd be a nice, lightweight serializer, and while it's cool at first glance, this release didn't pass the "fall into the pit of success" test for an object serializer.

UPDATE: Json.NET has been updated and now works as expected and includes helper methods to make the job simpler...new code below.

using System;
using System.IO;
using System.Collections.Generic;
using System.Text;

namespace ConsoleApplication1
{
    public class Person
    {
        public string firstName = "Scott";
        public string lastName = "Hanselman";
        public DateTime birthDay = new DateTime(1970, 1, 15, 1, 1, 0);
        public decimal moneyInPocket = 4.5M;
    }

    class Program
    {
        static void Main(string[] args)
        {
            Person p = new Person();
            string output = Newtonsoft.Json.JavaScriptConvert.SerializeObject(p);

            output = output.Replace("Scott", "Fred");
            output = output.Replace("Hanselman", "Jones");

            Person anotherP = Newtonsoft.Json.JavaScriptConvert.DeserializeObject(output, typeof(Person)) as Person;
            Console.WriteLine(anotherP.firstName + " " + anotherP.lastName);
        }
    }
}

Then I tried Ajax.NET 6.7.2.1 from Michael Schwarz...

using System;
using System.IO;
using System.Collections.Generic;
using System.Text;
using Microsoft.Web.Script.Serialization;

namespace ConsoleApplication1
{
    public class Person
    {
        public string firstName = "Scott";
        public string lastName = "Hanselman";
        public DateTime birthDay = new DateTime(1970, 1, 15, 1, 1, 0);
        public decimal moneyInPocket = 4.5M;
    }

    class Program
    {
        static void Main(string[] args)
        {
            Person p = new Person();
            string output = AjaxPro.JavaScriptSerializer.Serialize(p);

            output = output.Replace("Scott", "Fred");
            output = output.Replace("Hanselman", "Jones");

            Person anotherP = AjaxPro.JavaScriptDeserializer.DeserializeFromJson(output, typeof(Person)) as Person;
            Console.WriteLine(anotherP.firstName + " " + anotherP.lastName);
        }
    }
}

...but I got this exception on deserialization. Reflectoring through the source suggests that null was passed to DateTimeConverter.Deserialize. I think an ArgumentNullException would have been clearer.

System.NotSupportedException was unhandled
  Message="Specified method is not supported."
  Source="AjaxPro.2"
  StackTrace:
       at AjaxPro.DateTimeConverter.Deserialize(IJavaScriptObject o, Type t)
       at AjaxPro.JavaScriptDeserializer.Deserialize(IJavaScriptObject o, Type type)
       at AjaxPro.JavaScriptDeserializer.DeserializeCustomObject(JavaScriptObject o, Type type)
       at AjaxPro.JavaScriptDeserializer.Deserialize(IJavaScriptObject o, Type type)
       at AjaxPro.JavaScriptDeserializer.DeserializeFromJson(String json, Type type)
       at ConsoleApplication1.Program.Main(String[] args) in C:\Documents and Settings\Scott\Desktop\ConsoleApplication1\ConsoleApplication1\Program.cs:line 29

However, I was able to get it to round-trip when I removed the DateTime. Not sure what's up with that.

Also interesting: Ajax.NET (AjaxPro) saved the Type/Assembly Qualified Name in the resulting JSON. I can see why they'd want to do that, but one of the nice things about JavaScript and JSON in general is the cleanliness and flexibility of the wire format. This could complicate things if I've got different CLR types on the server consuming the same serialized JSON from the client. It also serializes the DateTime differently than I'm used to.

{"__type":"ConsoleApplication1.Person, ConsoleApplication1, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null", "firstName":"Scott", "lastName":"Hanselman", "birthDay": new Date(Date.UTC(1970,0,15,9,1,0,0)), "moneyInPocket":4.5}
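The two date notations are easy to compare directly. A small TypeScript sketch (note the UTC-8 offset below is my inference from comparing the two payloads, not something either library documents):

```typescript
// Atlas-style: raw milliseconds since the epoch, i.e. the DateTime's
// wall-clock values read as if they were UTC.
const atlasDate = new Date(1213260000); // 1970-01-15T01:01:00.000Z

// AjaxPro-style: an explicit Date.UTC(...) call (months are zero-based in
// JavaScript Dates, so 0 = January). The hour is 9 rather than 1 because
// AjaxPro converted the local DateTime to UTC first; the 8-hour gap
// suggests a UTC-8 (Pacific) machine -- an assumption on my part.
const ajaxProDate = new Date(Date.UTC(1970, 0, 15, 9, 1, 0, 0));

const hoursApart = (ajaxProDate.getTime() - atlasDate.getTime()) / 3600000;
console.log(hoursApart); // 8
```

So the two serializers don't just format the date differently; they disagree about whether the DateTime's wall-clock values are local or UTC, which matters as soon as clients in other time zones consume the payload.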

Moving to Microsoft's Atlas, similar and slightly simpler code works just fine like this:

using System;
using System.IO;
using System.Collections.Generic;
using System.Text;
using Microsoft.Web.Script.Serialization;

namespace ConsoleApplication1
{
    public class Person
    {
        public string firstName = "Scott";
        public string lastName = "Hanselman";
        public DateTime birthDay = new DateTime(1970, 1, 15, 1, 1, 0);
        public decimal moneyInPocket = 4.5M;
    }

    class Program
    {
        static void Main(string[] args)
        {
            Person p = new Person();
            string output = JavaScriptObjectSerializer.Serialize(p);

            output = output.Replace("Scott", "Fred");
            output = output.Replace("Hanselman", "Jones");

            Person anotherP = JavaScriptObjectDeserializer.Deserialize(output, typeof(Person)) as Person;
            Console.WriteLine(anotherP.firstName + " " + anotherP.lastName);
        }
    }
}

...giving the expected output of

{"firstName":"Scott", "lastName":"Hanselman", "birthDay": new Date(1213260000), "moneyInPocket":4.5}

Between these three JSON serializers and this simple test, only Atlas "just worked" (assuming my 'simple test' isn't flawed). For now I'll use the Atlas assemblies for my JSON serialization needs, but it'd be nice if I could back-port the chunky parts of one of the other libraries to .NET 1.1 for my projects that can't use 2.0.

UPDATE: Json.NET and Atlas appear to work the same with this simple test.

Of course, the really sad thing is that John Lam or my boss will step in here any minute and remind us that in Ruby on Rails you can just say object.to_json and get JSON strings. Phooey!

As a totally unrelated aside, and for the purposes of starting a discussion: in the JSON.NET source code, the author James Newton-King appears to have decompiled System.Web.UI.Util.QuoteJScriptString from ASP.NET and included the decompiled source (with a few modifications) directly in JSON.NET's JavaScriptUtils.cs, which is then licensed under the Creative Commons Attribution 2.5 License.

UPDATE: James Newton-King updated the JSON framework and removed the CC license from the JavaScriptUtils.cs file, as it wasn't his intent to release Microsoft code like that. He also included an explanatory comment. Seems like a reasonable solution to me.

The question is: when a useful function exists in the .NET Framework but is marked internal or private, for whatever reason, is it better to:

A. Decompile it and include it in your code anyway (hopefully with an explanatory comment).
B. Write your own from scratch.
C. Call the internal/private method via Reflection.

Discuss.


An Xml Tidy in PowerShell or Formatting Xml with Indenting with PowerShell

July 03, 2006 Comment on this post [4] Posted in PowerShell | XML


I like my XML pretty. There's no format-xml or tidy-xml cmdlet in PowerShell, so here's my first try:

#Name me tidy-xml.ps1
# - this crap written by Scott Hanselman
[System.Reflection.Assembly]::LoadWithPartialName("System.Xml") > $null
$PRIVATE:tempString = ""
if ($args[0].GetType().Name -eq "XmlDocument")
{
 $PRIVATE:tempString = $args[0].get_outerXml()
}
if ($args[0].GetType().Name -eq "String")
{
 $PRIVATE:tempString = $args[0]
}
$r = new-object System.Xml.XmlTextReader(new-object System.IO.StringReader($PRIVATE:tempString))
$sw = new-object System.IO.StringWriter
$w = new-object System.Xml.XmlTextWriter($sw)
$w.Formatting = [System.Xml.Formatting]::Indented
do { $w.WriteNode($r, $false) } while ($r.Read())
$w.Close()
$r.Close()
$sw.ToString()

Sometimes XML is thought of as strings and sometimes as [xml] in PowerShell. This script will take either a string or [xml], but will always return a string. (The final [xml] cast is on you; if the script did it for you, the tidying would be moot.) For example:

PS> $a = "<foo><bar>asdasd</bar></foo>"
PS> ./tidy-xml $a
<foo>
  <bar>asdasd</bar>
</foo>
PS> $b = [xml]"<foo><bar>asdasd</bar></foo>"
PS> ./tidy-xml $b
<foo>
  <bar>asdasd</bar>
</foo>

I wanted to make the following scenarios work as well. Thoughts? Remember that I need to normalize to a string for the StringReader constructor.

#couldn't because it returned an Object[] of strings and it got sloppy fast
get-content foo.xml | tidy-xml

#couldn't because it (oddly) returned an ArrayList of strings and it got sloppy fast
get-content foo.xml -ov c
tidy-xml $c

Enjoy (or improve!)

UPDATE: Here's a better version that includes a number of best-practice changes, as well as support for taking in objects from the pipeline (like I wanted originally):

#The following cases work
#
#PS>$a
#<foo><bar>this is A</bar></foo>
#PS>$b.get_OuterXml()
#<foo><bar>this is B</bar></foo>
#PS>Get-Content foo.xml
#<foo>
#   <bar>this is C</bar>
#</foo>
#
#Now try the following.
#PS>sal ti tidy-xml
#PS>$a | ti
#PS>$b | ti
#PS>$c | ti
#PS>ti $a
#PS>ti $b
#PS>ti $c
#PS>$a, $b | ti
#PS>$a, $c | ti
#PS>$c, $b | ti
#PS>$a, $b, $c | ti
#
#What doesn't work here is when you pass a multiple parameter input as follows:
#tidy-xml $a, $b # doesn't work
#
#Uhm, i think i would have to change my logic "completely" to actually get that to work...
#(after refactoring "process" block...)
#
#Name me tidy-xml.ps1
# - some of this crap written by Scott Hanselman
function Tidy-Xml {
    begin {
        $private:str = ""
       
        # recursively concatenate strings from passed-in arrays of schmutz
        # not sure how to improve this...
        function ConcatString ([object[]] $szArray) {
            # return string
            $private:rStr = ""

            # Recursively call itself, if a string is also of array or a collection type
            foreach ($private:sz in $szArray) {
                if (($private:sz.GetType().IsArray) -or `
                    ($private:sz -is [System.Collections.IList])) {
                    $private:rStr += ConcatString($private:sz)
                }
                elseif ($private:sz -is [xml]) {
                    $private:rStr += $private:sz.Get_OuterXml()
                }
                else {
                    $private:rStr += $private:sz
                }
            }
            return $private:rStr;
        }
       
        # Original "Tidy-Xml" portion
        function FormatXmlString ($arg) {
            # ignore parse errors
            trap { continue; }
           
            # out-null hides output of the assembly load
            [System.Reflection.Assembly]::LoadWithPartialName("System.Xml") | out-null

            $PRIVATE:tempString = ""
            if ($arg -is [xml]){
                $PRIVATE:tempString = $arg.get_outerXml()
            }
            if ($arg -is [string]){
                $PRIVATE:tempString = $arg
            }

            # the ` tick mark is a line-continuation char
            $r = new-object System.Xml.XmlTextReader(`
                new-object System.IO.StringReader($PRIVATE:tempString))
            $sw = new-object System.IO.StringWriter
            $w = new-object System.Xml.XmlTextWriter($sw)
            $w.Formatting = [System.Xml.Formatting]::Indented

            do { $w.WriteNode($r, $false) } while ($r.Read())

            $w.Close()
            $r.Close()
            $sw.ToString()
        }
    }
   
    process {
        # For non-xml strings or types, they will be buffered and will be
        # taken care of in "end" block
        
        # this checks for objects that have been "pipe'd" in.
        if ($_) {
            # check if whatever we have appended is a valid XML or not
            $private:xmlStr = ($private:str + $_) -as [xml]
           
            if ($private:xmlStr -ne $null) {
                FormatXmlString([xml]$private:xmlStr)
                # clear the string not to be handled in "end" block
                $private:str = $null
            } else {
                if ($_ -is [string]) {
                    $private:str += $_
                } elseif ($_ -is [xml]) {
                    FormatXmlString($_)
                }
                # for an array or a collection type,
                elseif ($_.Count) {
                    # iterate each item in the collection and append
                    foreach ($i in $_) {
                        $private:line += $i
                    }
                    $private:str += $private:line
                }
            }
        }
    }

    end {
        if ([string]::IsNullOrEmpty($private:str)) {
            $private:szXml = $(ConcatString($args)) -as [xml]
            if (! [string]::IsNullOrEmpty($private:szXml)) {
                FormatXmlString([xml]$private:szXml)
            }
        } else {
            FormatXmlString([xml]$private:str)
        }
    }
}

Thanks to MonadBlog for the updates! There's definitely some room for refactoring the begin/process/end blocks, but it's more functional this way.


Querying Virtual Server 2005 via VM with PowerShell

July 03, 2006 Comment on this post [4] Posted in PowerShell

One of the guys in IT manages a lot of Virtual Server instances - dozens - adding up to many dozens of Virtual Machines, all supporting our many devs. He wanted to get some status information with PowerShell. Here's what I came up with.

We used WMI Explorer to check out the WMI namespace installed by Virtual Server (root/vm/virtualserver).

Given a CSV file like this, containing (at least) a Virtual Server computername column:

computername,owner,whatever
MSVS1,fred,somedata
MSVS2,joe,somedata
MSVS3,luigi,somedata

We did this:

import-csv servers.csv |
foreach-object { $_.computername } |
foreach-object { Get-WmiObject -computername $_ -namespace "root/vm/virtualserver" -class VirtualMachine } |
select @{Expression={$_.__SERVER}; Name="Server"},
  Name, CpuUtilization,
  @{Expression={$_.Uptime/60}; Name="Uptime(min)"},
  PhysicalMemoryAllocated, DiskSpaceUsed |
format-table -groupby Server -property Name,CpuUtilization,"Uptime(min)",PhysicalMemoryAllocated,DiskSpaceUsed

Which gives us, more or less, a table of each VM's name, CPU utilization, uptime, memory and disk usage, grouped by server.

We can also reformat it, make it smaller, and run it in a loop to get a "top" equivalent for all our VMs - catching machines that are running out of space or working too hard, and doing some capacity planning.

Note, I originally wanted to do this:

import-csv servers.csv | Get-WmiObject -namespace "root/vm/virtualserver" -class VirtualMachine

and have "computername" automatically bound, because the name is the same in the CSV file and in the PowerShell parameter name. This kind of binding does work in some instances - make a CSV file like this, named pids.csv:

ID
1
2
3
4
5

and execute this PowerShell pipeline

import-csv pids.csv | get-process

and you'll get

Get-Process : No process with process ID 1 was found.
At line:1 char:33
+ import-csv pids.csv | get-process <<<<
Get-Process : No process with process ID 2 was found.
At line:1 char:33
+ import-csv pids.csv | get-process <<<<
Get-Process : No process with process ID 3 was found.
At line:1 char:33
+ import-csv pids.csv | get-process <<<<

Handles  NPM(K)    PM(K)      WS(K) VM(M)   CPU(s)     Id ProcessName
-------  ------    -----      ----- -----   ------     -- -----------
   2399       0        0         32     2   507.75      4 System
Get-Process : No process with process ID 5 was found.
At line:1 char:33
+ import-csv pids.csv | get-process <<<<

See how it called get-process for each ID, automatically binding the ID column of the table coming from the CSV to the ID parameter? I wanted to do the same with computername, but it didn't work.

Get-WmiObject : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
At line:1 char:21

I got this error, which I assume means that Get-WmiObject doesn't take pipeline input in the build of PowerShell I have (RC1). I hope this is queued to get fixed ASAP - or I'm just missing something.

UPDATE: A "help get-wmiobject" (duh, RTFM) confirms that -ComputerName doesn't take pipeline input.

    -ComputerName <System.String[]>
        Declares on which computer(s) the WMI object may be found

        Parameter required?           false
        Parameter position?           named
        Parameter type                System.String[]
        Default value                 localhost
        Accept multiple values?       true
        Accepts pipeline input?       false
        Accepts wildcard characters?  false

Bummer.


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.