Windows Tech Support


Saturday, 10 November 2012

Blog News and Updates!

Posted on 17:26 by Unknown
It's been a while since I've posted anything mildly interesting here (even if you include the article I just posted before this).  But I have an announcement to make...

I've accepted an offer to write for 4Sysops.com!

I will post an update here when my first article is published, so please stay tuned.  I'm putting a lot of time into the articles, so I hope you find them useful and entertaining (yeah, I will still insert my stupid humor where I can).


Posted in articles, blogs, publishing, technology, writing | No comments

Crude But Effective (ConfigMgr Right-Click Tools Trickery)

Posted on 17:21 by Unknown
Intro:  I just took the wraps off this particular "feature" within a web application project I've been working on for some time now.  So I figured it was a good time to share some thoughts about why I spent the time and effort to make it work.  I'm not going to say it's 100% complete yet, and I still have some features to fill out, but it's walking on two legs and saying "Daddy!", so I'm kind of proud of it.  I actually submitted this to another blog site, but it was rejected as not being within the topic set they prefer, so I'm posting it here.

A Little Background

Anyone who grew up watching the original Star Trek series on TV should recall a particularly famous line from Spock: "Crude, but effective."  The implication was that a "solution" doesn't always have to be elegant or optimal in order to be sufficient.  Hence the title of this article, and of the mini-project I'm about to describe and bore you to death with. So, let's get started!

One of the most widely-used tools in the world of Microsoft enterprise systems management is System Center Configuration Manager.  One of the most widely-used tools to extend the functionality of Configuration Manager is (or are) the "SCCM Right-Click Tools", developed and supported by Rick Houchins (link).
The tool-set installs a set of scripts, along with some XML extensions to the MMC console snap-in for Configuration Manager.  The result is an additional set of pop-out menus when you right-click on resources in the MMC console.  They are grouped into "Tools", "Actions", "Log Files" and so on, each having a set of links to perform useful tasks upon a single resource (computer) or all of the resources in a selected Collection. Some of the features it provides include:
  • Invoke ConfigMgr Agent actions such as:
      ○ Hardware (and Software) Inventory
      ○ Machine (and User) Policy Retrieval and Evaluation
      ○ Discovery Data Collection Cycle
      ○ More
  • Run Client Tools such as:
      ○ Restart ConfigMgr Agent service
      ○ Uninstall/Re-install ConfigMgr Client
      ○ Re-Run Advertisements
  • View Client Log Files
  • View Reports for selected Clients or Collections

There are quite a few versions of this out in the wild, and I've rarely seen or heard of two IT shops using the same (or even the latest) version.  Regardless, Rick's product has become so popular and widely known that it's hard to find a ConfigMgr administrator anywhere in the world who hasn't heard of it, let alone one who doesn't use it every day.  It's even spawned inspired projects such as Client Tools (link) and SCCM Client Actions Tool (link).  Some have taken off, while others have not.  Ultimately, it's a good thing to inspire others to try good things for the good of others, is it not?
One of the larger projects I've been working on for the past year is a web-based tool for integrating and managing multiple enterprise "islands" of information to achieve a holistic management tool.  This involves Configuration Manager, Active Directory, legacy inventory management systems, and multiple databases, and rolls all of that into a Role-Based Access Control interface that maps the features to the discrete functional groups within an IT department, as well as specific features made available to end users.
Some of you might wonder if this has anything to do with my old "Windows Web Admin" project that I killed a while ago.  The answer to that is "yes".  WWA formed the basis of this project, but if WWA was 1.0, this project is approximately 5.0.  There's a lot of change and scaling-out in this one, but its genesis was WWA.  Okay, enough of that. Let's move on...
One of the most daunting challenges that I've been trying to solve is how to incorporate my own set of "client tools" into the web interface.  Why is this so difficult?  Primarily, the biggest concern is security risk and exposure.  There are quite a few potential ways to approach this, but let’s break it down in the most basic terms:

The Goal

The goal of this particular subset of the project is to be able to directly invoke processes on remote computers over a network connection, and initiate this from within a web browser.  Some aspects of the Right-Click tools are easy to implement via a web interface, such as exploring the C: drive, opening the remote log or cache folder, and ping for connectivity testing.  But the features which require invoking a WMI or WBEM/SWBEM interface remotely are a little more complicated to achieve from within a local web browser session.  At least they are for my limited set of abilities.
In the simplest terms, WBEM, or Web-Based Enterprise Management, is the web interface for WMI services on a given computer.  WBEM is the mechanism by which you connect to, and interact with, the ConfigMgr client on a remote computer.  It’s also how you connect to, and interact with the site server, but that’s for another article.
WMI and WBEM can be a little complicated to describe, but that’s not necessary for this article.  But you do need at least a basic understanding of WBEM as it pertains to "what it is", so that you can appreciate what’s going on under the hood when you turn the key and start this beast up.
The good news is that you don’t have to roll up your sleeves and get dirty with programming code in order to leverage WBEM's benefits.  There are packaged utilities that can do the messy work for you, such as the SendSchedule.exe utility included with the Microsoft ConfigMgr Toolkit v2.
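Just to illustrate how little "programming" that packaged-utility route actually requires, here's a small Python sketch that builds the command line you would hand to SendSchedule.exe.  The schedule GUID shown (commonly cited as the hardware inventory cycle) and the tool name/path are assumptions on my part, so verify both against your own Toolkit install:

```python
# Sketch: build the SendSchedule.exe command for triggering a client action.
# The GUID below is the commonly-cited hardware inventory schedule ID --
# treat it as an assumption and check it against your own environment.
HARDWARE_INVENTORY = "{00000000-0000-0000-0000-000000000001}"

def send_schedule_cmd(schedule_id, computer, tool=r"SendSchedule.exe"):
    """Return the argument list for invoking SendSchedule against a remote client."""
    return [tool, schedule_id, computer]

# Hand this list to your process launcher of choice (subprocess, a scheduled
# task, etc.) under an account with remote WMI rights on the target.
cmd = send_schedule_cmd(HARDWARE_INVENTORY, "REMOTE-PC-01")
print(" ".join(cmd))
```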
There are probably more potential "options" to solving this dilemma, but I've boiled it down to three:

Option 1 - Client-Side Code

It could be done with some JavaScript code, with JSON or jQuery or whatever, running as a client-side process (on the computer where the browser is active).  This makes it possible to run in the context of the logged-on user.
The problem with the client-side script option is the security context and "sand-boxing" involved in invoking other local scripts, or an executable, under the logged-on user's context.  There's also the challenge of maintaining centralized access control and logging. The security model in this scenario relies on individual user accounts having permissions to invoke remote interfaces like the ConfigMgr Client Agent service.  That isn't necessarily a bad thing, but it does depend on diligent administration of an AD security group.

Option 2 - Server-Side Code

It could be done with server-side code, but that would involve forked or marshaled processes running under the context of a proxy account.  Or it could be run in the context of the IIS application pool, or even the IIS web site.
The biggest problem with the server-side code approach is the use of a proxy user account, and controlling access to the folders and files in which the user context can execute.  The security model in this scenario is a single "proxy" user account, with permissions granted to allow it to invoke remote interfaces on client computers.

Option 3 - A Real Developer

It could also be done with custom programming using .NET or Java and a compiled executable or even a browser add-in.
The security model in this scenario could be either of the two described in the first two options above, or even a hybrid of both.  However, the less obvious "problem" with this approach comes down to complexity, time and resources.  Very often the fourth issue is budget.  In our case, this isn't a viable option at our disposal.  What we do have at our disposal is.... me.
That's right.  Simple. Basic. Me.  My skill set is not the most robust on Earth (big shock, I know), but it does contain enough database and coding skills, and a fetish for application design, to be dangerous.  And if you (okay, I) add a pinch of stupidity, sarcasm and bad humor, and a teaspoon of caffeine to the mix, you have a concoction that gets it done.  So this led me to option 4...

Option 4 - Duct Tape, Chewing Gum, and Baling Wire

The old MacGyver approach.  This is actually a very old method, but it's a tried-and-true one that has stood the test of time and many, many projects.  It's the old "web-database-scheduler" approach.  Let me digress...
In the most basic terms possible:
There’s a web interface for submitting the requested "action" to be performed on a remote client.  This captures the basic information: the client (or collection) name, and the action to be performed.  Before you start flapping away about which language is "best" for this role, I’ll just gently close your lips with my greasy fingers, encased in old welding gloves, and whisper: "shhhhhhhh... it doesn’t really matter."  It's true. You could crank this out using PHP, ASP, ASP.NET, Ruby, Python, Mython, Yourthon, Therethon or Whateverthon.  As long as it can display a web form in a browser session, collect the input, and interact with a database to store the information, you’re good to go.
Next, there's a database table for storing the submitted requests entered from the web form.  This includes the client name, the action to be performed, as well as who requested it, and when (date and time), and task-related things like "is-completed" and when, along with other optional pieces of information.
Then there's a scheduled task, which reads the database table, on a frequent and recurring schedule, fetching only those rows which have not already been processed (completed), and executes the requested action on the specified remote computer.  After each task is completed, the corresponding row in the database table is updated to indicate it was completed and time-stamped.  This is what effectively prevents the entire process from melting down by re-running every row every time.
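To make those three pieces a little more concrete, here's a minimal sketch of the "scheduler" half in Python, with an in-memory SQLite table standing in for the real SQL Server queue.  The table layout, column names, and function names are all my own invention for illustration; the real version would hit your actual DBMS and would really invoke the remote action instead of just printing about it:

```python
import sqlite3
from datetime import datetime

# Stand-in for the queue table that the web form writes requests into.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE task_queue (
    id           INTEGER PRIMARY KEY,
    client_name  TEXT,
    action       TEXT,
    requested_by TEXT,
    completed    INTEGER DEFAULT 0,
    completed_at TEXT)""")
db.execute("INSERT INTO task_queue (client_name, action, requested_by) "
           "VALUES ('PC-001', 'HardwareInventory', 'some_user')")

def run_pending_tasks(db):
    """One scheduler cycle: fetch unprocessed rows, 'run' them, mark complete."""
    rows = db.execute("SELECT id, client_name, action FROM task_queue "
                      "WHERE completed = 0").fetchall()
    for row_id, client, action in rows:
        # Real version: check connectivity, then invoke the action remotely
        # (SWBEM call or SendSchedule.exe) under the proxy account.
        print(f"Invoking {action} on {client}")
        # Time-stamping the row is what keeps the next cycle from re-running it.
        db.execute("UPDATE task_queue SET completed = 1, completed_at = ? "
                   "WHERE id = ?", (datetime.now().isoformat(), row_id))
    db.commit()
    return len(rows)

run_pending_tasks(db)   # processes the one pending row
run_pending_tasks(db)   # the next cycle finds nothing left to do
```

The "completed" flag plus time-stamp is the whole trick: every cycle only ever sees rows it hasn't touched yet, which is what keeps the process from melting down.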
So, putting this all together, you get a process that works like this:
1.  An authorized user of the web site opens a web page for a particular computer or Collection of computers, and clicks a button/link for "Client Tools".  This opens a web form with a list of available "actions" to perform on the computer(s) remotely.  The user selects the desired action and clicks "Submit".  The information is then entered into a database table.  In my case, I'm using SQL Server 2008 R2.  But you could use Oracle, MySQL, Sybase, Informix, DB2, or just about anything that's "robust" enough to support a business environment with multiple users.
2.  The scheduled task, running under the context of a proxy account with permissions to invoke client agent actions remotely, executes a script on the next cycle.  The script reads all rows which are not yet marked as completed.  Iterating through the set of rows, it reads the name of the computer to be acted upon and the requested "action" to invoke.  The script checks for connectivity to the remote computer, and then executes the remote action using either the SWBEM interface (via COM or .NET) or, in the case of my lazy-ass approach, the SendSchedule.exe utility (included with the ConfigMgr Toolkit v2 download).  After running the task, it updates the row to set the "completed" field and enters a time-stamp to indicate when it was processed.
3.  The remote client receives the request from the remote script execution, under the user context of the scheduled task/job that launched it.  It then verifies authentication and, if allowed, invokes the client action or other (possibly custom) task.
Clunky?  Yep.  Complicated?  Not really (I've seen things MUCH more complicated doing much less).  Could it be done more simply or more elegantly?  You betcha!  

Some Advantages

So, what additional benefits does this approach provide?  For starters, since the action is really based on a SQL database repository, and a job scheduler, I have a centralized model.  That means I have the means to log everything going on.  Now, instead of every console-user running a local task, with log files on their computer and the remote computers, everything is in one place, where it's easy to sort and manage and get useful reports out.  It's also easy to apply a security model to restrict access in one place at one time.  I'm not going to say web applications are a panacea, but they do offer some very attractive capabilities.

Here are a few screen shots of it.  The first image is the Resource details view, which is showing the general "Computer System" properties.  The "Client Tools" button is at the upper-right corner.


After clicking the "Client Tools" button, the pop-up form is shown (below).  Right now, I only have three of the Client Actions exposed, not because there's a problem with them, but because I'm working on role-based filtering of features. The user session for this example doesn't have access to the other actions.

The image below is the log report, which captures every submitted request and shows when it was processed and the result.


Conclusion Contusion

Could this all have been done a different/better way?  I'm sure it could have, but I'm working against two huge constraints: time and skill set.  Time is very limited and my skill set is still mostly ASP/SQL.  I've done a lot with PHP also, but in this environment it didn't make sense to shoehorn it in.  I used to work with ASP.NET for a brief period, but that was a while ago and I haven't had the opportunity to brush up on the newer technologies.  I know: excuses-excuses.  Feh.
The third constraint is budget.  Budgets are awesome.  If only we had one.  For now, duct tape and chewing gum will do just fine.
Posted in asp, config manager, logging, network administration, sccm, sql server, system center, web development, wmi | No comments

Sunday, 4 November 2012

Why I'm Still Not 100% on PowerShell

Posted on 05:13 by Unknown
In the past, I have been somewhat critical of PowerShell.  Not because it is somehow technically inferior to alternatives, but because of environmental ramifications.  I was an early adopter actually, having joined up in the Monad testing program, and I was very excited about the potential it offered.  Today, I use PowerShell more than I ever have, but there are still many types of tasks that I don't use it for:

  • Deploying / Installing Software to Remote Computers
  • Maintaining Legacy Script Files
  • Heterogeneous Windows Versions and .NET versions
The main reason I don't use it for software deployments is the slower execution time.  Compared with basic Batch/CMD scripts, or even VBScript, it just takes longer to "spin up" the .NET and PowerShell foundation goodies before it even parses the script code.  Multiply that by multiple installs per remote computer, and hundreds of remote computers, and the aggregate time difference can be significant.  This is especially true on older hardware and older operating systems, which brings up the point of the "pervasiveness" of PowerShell in a mixed environment:

Many environments I walk into (I'm a consultant, after all) are not homogeneous when it comes to operating system versions, or even "common applications".  It's not unusual to find multiple configurations of .NET, Java Runtime, DirectX, MSXML, Oracle client, and SQL Native Client installations.  One thing I rarely have to contend with is inconsistent support for Windows Script Host, let alone the ever-present CMD shell.  I can't say the same for PowerShell.  I wish it were as consistent and pervasive, but it's just not.  I'm sure it will be someday, but we all know how long it takes customers to upgrade to newer operating systems.

To be fair, the same was true for Windows Script Host back in the early days of Windows NT.  Microsoft cranked out several versions until they finally stopped at 5.8 and let it sink in.  That had a nice impact on most shops, since they were no longer worried that as soon as they deployed WSH they would have to follow up with another upgrade.  PowerShell is still evolving, so many IT managers aren't over-eager to deploy 3.0 when they seem to expect a 3.1 to come out any day.

I know it's not exactly logical to expect that PowerShell and .NET could somehow be combined and made more cohesive (for more streamlined deployment), even though it would help.  The size of such a deployment package would be excessive for many shops to deal with, and might be tough to deploy on legacy hardware where local disk storage is almost maxed out.  No, it seems Microsoft is betting that customers upgrading to Windows 8 will take care of everything as it pertains to achieving a ubiquitous PowerShell presence.  I don't think that's going to happen anytime soon, however, for a variety of reasons.  So, for now, while I continue to expand my PowerShell scope, I am still dependent on VBScript and Batch scripts for many tasks.

I'm sure there are some of you out there who will shake your heads in disbelief at all this, and that's fine.  I welcome constructive feedback.  So if you have some insights or ideas about how this can be managed more effectively, please let me know.
Posted in batch, cmd, network administration, powershell, scripting, software deployment, vbscript, windows8 | No comments

Wednesday, 17 October 2012

When is "Better" Really "Better"?

Posted on 16:52 by Unknown
Years ago, like, way back in 2003, there was a thing that large software vendors used to do called an "open Beta" program.  This was when the vendor would invite large numbers of current and potential customers to kick the tires on forthcoming new products, or new versions of existing products.  Phew! That was a long-winded sentence, wasn't it?

Nearly every vendor, small to large, supported this approach to vetting their ideas for marketability.  Some still exist.  Some are long gone.  Microsoft.  IBM.  Novell (who?).  Allaire, later eaten by Macromedia, later to be gobbled up by Adobe.  Autodesk. Sun Microsystems (now Oracle).  Even WinZip had a "beta program" you could sign up for.  The process was simple: You enrolled, downloaded the binaries, installed them and agreed to submit bug reports and feature requests.  In the end, when the program concluded, you (most often) received a complimentary license for the final product.  Many participants earned licenses for Windows, Office, AutoCAD, WinZip, ColdFusion, Dreamweaver, and many other products.

The best part of this whole era was the blurred distinction between the textbook definitions of "Alpha" and "Beta" as they pertain to software development.

You see, in the early days, when guys wore white button-down shirts with pocket protectors, and taped up glasses with slicked-over hair (yes, think IBM in the 1970's), the term "Alpha" meant that features were still in a state of flux, while "Beta" meant features were locked-down, but functionality and reliability needed to be tested.  That's a big difference.  Luckily for us (consumers/geeks), most of these vendors had spent so much time on the bong wagon that they forgot about such distinctions.  It was fairly common to engage an engineer in a forum, or by e-mail, even by phone, to brainstorm about a new feature or idea. You could expect frequent build updates to reveal new features and options, even new UI changes, along the way.

Those days are gone.  They have since been replaced with "Community Preview" and "Pre-Release" programs.  These are dyed-in-the-wool "Beta" programs, where the ear for new ideas is deaf, and all the vendor wants from you is a thumbs-up.  If you have a thumbs-down, well, you're obviously not in their target demographic.  When you encounter something you don't care for, you find yourself at a loss for avenues to submit your counter-suggestions back to the vendor.  They really don't want to hear about it now.  It's too late in the process.  The bean-counters have put a stake in the ground for target release dates, and communicated it to their "platinum" partners and shareholders.  To make drastic changes at that stage might risk their credibility with people who have NO idea what the "soft" in "software" really means.  But they have the money and that's all that counts.

Take for example this situation:

You release an early "preview" version of the next version of your flagship product to the masses.  The media outlets are given an even earlier "preview" so they can start the hype machine in full motion, thereby drumming up increased desire in the geek minds.  The hype machine vortex is at stage two now.  Stage one is when you start blabbering to the press about new features before letting anyone actually see them.

Within the first week, the Internet is a-buzz with some particular aspect of your new feature-set that users really don't care for.  How bad do they not care for it?  So bad that they start building add-on software to disable or modify that feature.

If that doesn't speak directly into the eyeballs and brains of those driving the machine (in this case: YOU), then you are already losing the battle.

Any half-decent first-year marketing student would say that if your customers are consistently and overwhelmingly changing your product in a very specific and consistent manner, then it's time you modify your product so they don't have to.  It's called "customer satisfaction".

Ok.  I've danced around this as thinly as I can without saying the obvious, so I'm just going to say the obvious:

The Windows 8 "Metro" tile interface might be nice for a touch-screen medium, but with a mouse and keyboard (you know, the kind that almost ALL of Microsoft's customers still use) it sucks.  That's not just my opinion; you can search Google or Bing or whatever you prefer and find plenty of gripes about the new interface on a traditional platform.  And if anyone thinks the new UI is enough of a draw to entice even half of their install base to dump their desktops and buy Surface products to replace them, well, you need to put down the Kool-Aid.  But Microsoft says the new UI is "better".  "Better" by whose rationale?  Theirs?  Obviously it's not the rationale of the thousands, maybe millions, of users who have installed add-ons like Classic Shell or Start8, which suppress the tile interface and "restore" a Windows 7 style Start Menu.

I guess we're supposed to stop deciding for ourselves and accept what we're told?  My ancestors would roll over in their graves at the thought of that.

(image copyright: Lifehacker.com)
Don't get me wrong, there's a lot to like about Windows 8.  But just like Microsoft did with Windows 7, they missed the marketing train.  Microsoft seems to insist on aiming their hype cannon at the general consumer, when their bread-and-butter base is really in enterprise environments and large organizations.  Enterprises are more interested in things like automation, manageability, customization, configuration management, update management, security, reliability... and things that boil down to LOWER COST and INCREASED OUTPUT from their operations.  But they hardly spewed a word about those things when promoting Windows 7, except within small circles (TechNet, MSDN, etc.).

I've lost count of the IT professionals who struggled to build their own business cases to convince their management to consider upgrading from XP to 7, when Vista left them totally cold.  They shouldn't have had to do their own legwork.  Microsoft should have sold it for them, but their big-dollar bullhorn was busy blasting out how "cool" it was/is, and the music and games, trying to be cool and trendy with ads that mimic their contemporaries.  I see the same thing happening with Windows 8, unfortunately.

My prediction is that an "option" will emerge to switch between them, at least on traditional platforms (desktops and laptops).  Time will tell if Tiles are better than Start Menus and Desktops.
Posted in business, marketing, predictions, software development, windows8 | No comments

Sunday, 14 October 2012

Config Manager Queries: CPU Types

Posted on 19:54 by Unknown
I probably should revive my old ScriptZilla blog for stuff like this, but the heck with it.  I'm just posting all this here from now on.  After all, it is the brain-scattering blogness that I'm kicking around, if that even makes any sense.

This is a simple SQL query to fetch all the unique CPU manufacturers and names within your inventoried ball of confusion...

SELECT Manufacturer, Name, COUNT(Name) AS QTY
FROM dbo.v_LU_CPU
GROUP BY Manufacturer, Name
ORDER BY Manufacturer, Name

Here's an example using VBScript.  (This example uses a DSN-less connection with an explicit SQL user account and password.  You can obviously run this under an SSPI or "trusted" context instead, or use a stored DSN.)

' ADO constants are not predefined outside VBA, so declare them here
Const adUseClient = 3
Const adOpenStatic = 3
Const adLockReadOnly = 1
Const adCmdText = 1

query = "SELECT Manufacturer, Name, COUNT(Name) AS QTY " & _
        "FROM dbo.v_LU_CPU " & _
        "GROUP BY Manufacturer, Name " & _
        "ORDER BY Manufacturer, Name"

dsn = "DRIVER=SQL Server;SERVER=DBServer1;database=SMS_ABC;UID=username;PWD=password;"
Set conn = CreateObject("ADODB.Connection")
Set cmd = CreateObject("ADODB.Command")
Set rs = CreateObject("ADODB.Recordset")

conn.Open dsn

rs.CursorLocation = adUseClient
rs.CursorType = adOpenStatic
rs.LockType = adLockReadOnly

Set cmd.ActiveConnection = conn

cmd.CommandType = adCmdText
cmd.CommandText = query
rs.Open cmd

If Not (rs.BOF And rs.EOF) Then
    Do Until rs.EOF
        For i = 0 To rs.Fields.Count - 1
            WScript.Echo rs.Fields(i).Name & vbTab & rs.Fields(i).Value
        Next
        rs.MoveNext
    Loop
Else
    WScript.Echo "no records found, bummer."
End If

rs.Close
conn.Close
Set rs = Nothing
Set cmd = Nothing
Set conn = Nothing

I was going to post a PowerShell example, but in going from v2 to v3 I'm finding so many confusing recommendations about the "best way" to invoke a simple T-SQL "SELECT" query against a remote SQL Server that my head is already spinning.  Some even recommend installing custom cmdlet extensions, and whatnot.  If anyone wants to point me to a nice, simple, concise example (i.e. equal or fewer lines of code than the VBScript example above), please post a reply. Gracias!
Posted in config manager, data mining, databases, reporting, sccm, sql server, system center, t-sql | No comments

Wednesday, 10 October 2012

Blog Changes - Pardon the Frustration

Posted on 18:49 by Unknown
I've found the "dynamic" templates on Blogger to be less than ideal, so I'm trying on different template themes and options to see what works best for various browsers, connection speeds and mobile devices.  Please pardon the interruption and frustration as I figure this out.  Thank you!
Posted in blogs | No comments

Software Feature Entropy Cycles - Part 2, Example

Posted on 18:12 by Unknown
I figured I might need to provide some concrete elaboration on my previous post about "Software Feature Entropy Cycles", so here goes.

Back in the late 1980's, while working for a "Naval Architecture and Engineering Firm", I began my career as a programmer.  According to my IRS tax return, my job was "Senior Engineer Technician", which was basically a glorified drafter, who also does some calculations.  My real job was writing tons and tons of LISP code for AutoCAD R10 and R11, and eventually R12 and onward, to automate design processes for piping and HVAC systems on U.S. Navy warships.

One of the interesting aspects about the world of CAD is that every niche industry has evolved its own unique "standards" of design and drafting.  From sheet borders, to dimensions and callouts, to tables and lists, to fonts and font sizes, to colors and layers.  You name it, I've seen every bizarre permutation of "standards" you can imagine. Metric vs. Standard. All Modelspace vs. Modelspace/Paperspace vs. all Paperspace.  Microscale and Macroscale.  Orthographic and Isometric and Oblique and Perspective, and whatever.

Some of the "standards" required for U.S. Naval design drawings were "NEVER CROSS CALLOUT LEADERS" and "NEVER CROSS A DIMENSION FEATURE WITH A CALLOUT LEADER".  Of course, in reality that wasn't always possible.  Some drawings contain such a complicated spewage of goo (technical term for tons and tons of shit, making the end result difficult to read and make sense of) that breaking those "rules" was not only hard to avoid, it was downright required.

So another (much better) programmer named Brad and I went to work writing some LISP routines to automatically detect when a Leader crossed another Leader or Dimension, and automatically break out a "gap" at the intersection.  Later versions of AutoCAD actually supported "real" LEADER entities, and also added the GROUP entity, so we updated our code to break the leader, remove the arrow head from the label leg, and apply both parts to a single GROUP entity to retain some behavioral integrity.  It worked pretty well, actually.
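For anyone curious, the geometric core of a routine like that is just 2D segment intersection: find where one leader crosses another, then trim a small gap around the crossing point.  Our actual code was AutoLISP; this Python sketch just shows the geometry, and every name in it is invented for illustration:

```python
import math

def seg_intersection(p1, p2, p3, p4):
    """Return the crossing point of segments p1-p2 and p3-p4, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if denom == 0:
        return None  # parallel (or collinear): no single crossing point
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / denom
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None  # the lines cross, but outside one segment or the other

def break_at(p1, p2, hit, gap=0.125):
    """Split segment p1-p2 at 'hit', leaving a gap of the given total width."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length   # unit direction along the leader
    half = gap / 2.0
    return ((p1, (hit[0] - ux * half, hit[1] - uy * half)),
            ((hit[0] + ux * half, hit[1] + uy * half), p2))

hit = seg_intersection((0, 0), (2, 2), (0, 2), (2, 0))   # -> (1.0, 1.0)
```

Everything past that (redrawing the trimmed pieces, re-attaching the arrow head, grouping them) is entity bookkeeping on top of this one test.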

Then Autodesk released "Bonus Tools", later renamed as "Express Tools", which included a leader gap feature that worked as well as ours, maybe better.  So we deprecated our code and moved on.  This is a fairly typical iterative process for most developers:  As the base product/technology/platform gains new features which previously required custom extensions, those custom extensions become unnecessary and deprecated.  Remaining feature "gaps" continue to be filled by custom extensions (aka program code), and newly-identified gaps are addressed with new extensions, and the cycle continues.

That's a pretty simple, yet concise example of a Software Feature Entropy Cycle.
Posted in applications, autocad, autolisp, automation, programming, software development | No comments


