Windows Tech Support


Monday, 30 April 2012

If It Ain't Broke...

Posted on 02:00 by Unknown
Something that has always bothered me about a lot of folks who work in the "IT" field is this:  They insist that "newer is better" as a knee-jerk perspective, while many of them drive old cars... by choice.  For example, I had a discussion with a colleague who was adamant that VBScript and BAT are "dead" and PowerShell is the "future".  I countered that they are not dead, and that PowerShell is simply another guest at the party.  He countered with the "newer is better" mantra.  Then I asked why he still drives a 1968 Chevy Camaro SS.  He started to answer and then stumbled.  I followed with this rationale...
  • Surely the "new" Camaro is engineered better (more efficient, safer, etc.)
  • The "new" Camaro has newer technology features
  • The "new" Camaro is built with more modern materials
So, why then does he not dump his '68 model for the 2013 model?

A-ha!

To be fair, it's not even about PowerShell.  It's not about any ONE language or technology.  It's about the whole mindset that one thing MUST replace another, rather than be enjoined to the community of potential tools to solve problems.  Medicine is a great example of this.  Even with the newest modern advances in medical technology and medical procedures, many of the most commonly used procedures date back decades, even centuries.  While the tools become more advanced, a drill is still a drill and a saw is still a saw.  Hopefully, you get where I'm going with this.

One tiny, little, itsy-bitsy example is this all-too-common approach to dealing with the Registry.  Someone says "hey!  I need you to delete a few Registry keys with a script... fast!"

You pull out the tool you know best, maybe it's VBScript, and you do something like this...

' Delete a couple of Registry keys using the WScript.Shell object.
Dim WshShell, Value1, Value2

' Note the trailing backslash: it tells RegDelete to delete a KEY.
' Without it, RegDelete deletes a VALUE named StupidKey1 instead.
Value1 = "HKEY_LOCAL_MACHINE\SOFTWARE\Fubar 2013\StupidKey1\"
Value2 = "HKEY_CURRENT_USER\SOFTWARE\Fubar 2013\StupidKey1\"

Set WshShell = WScript.CreateObject("WScript.Shell")

' Carry on if a key doesn't exist.
On Error Resume Next
WshShell.RegDelete Value1
WshShell.RegDelete Value2

There's obviously nothing wrong with that approach.  But why not just do it with TWO lines of code using REG.exe and a CMD shell script?  Command-line tools are often very powerful and often more compact in syntax.  Some that come to mind are DISM, APPCMD, REG, WMIC, SC, ROBOCOPY, and even CACLS and REGINI.

reg delete "HKLM\SOFTWARE\Fubar 2013\StupidKey1" /f
reg delete "HKCU\SOFTWARE\Fubar 2013\StupidKey1" /f
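And if the list of keys grows beyond two, the same idea extends to a tiny CMD loop.  This is just a sketch; the "Fubar 2013" keys are the same made-up examples as above:

```bat
@echo off
rem Delete the same example key from both hives; /f suppresses the prompt.
rem Redirecting output keeps the script quiet when a key is already gone.
for %%H in (HKLM HKCU) do (
    reg delete "%%H\SOFTWARE\Fubar 2013\StupidKey1" /f >nul 2>&1
)
```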

It really doesn't matter which two, or three, or six languages you want to compare and contrast.  Languages are tools.  Nothing more.  PERIOD.  Anyone who gets into a heated argument over which programming language is "best" deserves an open-hand smack.  Ok, enough of that, back to the discussion...

The ideal approach is to learn as many options as you can.  The more options the better.  You CANNOT learn too many options in any profession.  That is the crux of becoming a master at any trade.  Whether you are a carpenter, a painter, a brick mason, a surgeon, a soldier, or a programmer... the more options you learn, the better equipped you'll be to face unexpected or unfamiliar problems, and the faster and more efficiently you'll address the familiar ones.  That is what they call "value-add".  Learning.

So, before you write something (or someone) off as obsolete, think again.  Maybe it's still useful.  Maybe it's not really "broke".  Maybe the newer alternatives offer clear advantages in some situations, but not in *all* situations.  Throwing out the old options as a standard practice isn't really such a good idea after all.  If an option is still on the table, at least give it consideration.  You never know when the "old" option might be the "better" option.

Now, this is where my Perl colleagues will say "I can do that in one line" and chuckle.  I hate those guys.
Posted in learning, network administration, registry, scripting, technology, training

Sunday, 29 April 2012

Project Taphouse

Posted on 05:54 by Unknown
There's not really a project called "Taphouse", at least not one I'm working on.  I'm calling it that because I can't really discuss what it's for or for whom it's being built.  However, it has been (and still is) an interesting project.  It's forced me to drill into Google for lots of command goodies I knew about but hadn't used in a long time, and many I hadn't used in quite this way before.  So I thought I'd share some of it in case it's helpful to others.

Here's the basis of the requirement:

The Requirements

"Mobile tablets or laptops will be used to collect inventory data from remote locations using a wireless handheld barcode scanner device.  The remote locations will not always have accessible WiFi, nor a reliable 3G or 4G signal.  Further, testing at locations that have 4G LTE coverage indicates excessive battery drain when using active LTE communication.  Requesting a client-based inventory collection tool that can download configuration updates, as well as upload inventory data, but only when attached to the base network.  Material costs must be kept to absolute minimum."

This led to five minutes of head-scratching, a little Google searching, and finally an "a-ha!" moment:

A local web app using a local database.  The web app will offer the means for capturing inventory scan data, manual entry (when needed), and provide upload/download capability when connectivity allows.

The Ingredients
  1. Windows 7 tablet or laptop
  2. Local IIS instance and virtual directory configuration (Windows authentication enabled, anonymous authentication disabled, to allow tracking entries by the logged-on user)
  3. ASP or ASP.NET web app
  4. SQL Server 2008 R2 Express
  5. Coffee
  6. Sugar snacks
  7. Bad music
(Note: steps 2, 3, 4 can be done in any order, but it helps to move step 5 to step 1 sometimes)

The Deliverable
  1. A packaged installation that can be easily deployed to laptops or tablets, either manually, or via Configuration Manager advertisement, to enable user to begin capturing inventory data at remote locations in the field.
The Chunks
  • Step 1 is easy enough.  
  • Step 2 was interesting.  First I used the DISM command to install and configure IIS, the necessary component features, and Windows Authentication.
  • Step 3 involves creating the virtual directory target folder, copying in the web app content files, and then using APPCMD to create and configure the web site and virtual folder settings
  • Step 4 involves creating a configuration (response) file for installing SQL Express 2008 R2 to allow for silent installation on other computers.
Install and Configure IIS features

dism /Online /Enable-Feature /FeatureName:IIS-WebServerRole
dism /Online /Enable-Feature /FeatureName:IIS-WebServer
dism /Online /Enable-Feature /FeatureName:IIS-ApplicationDevelopment
dism /Online /Enable-Feature /FeatureName:IIS-ISAPIExtensions
dism /Online /Enable-Feature /FeatureName:IIS-ASP
dism /Online /Enable-Feature /FeatureName:IIS-WebServerManagementTools
dism /Online /Enable-Feature /FeatureName:IIS-Security
dism /Online /Enable-Feature /FeatureName:IIS-WindowsAuthentication
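Since those eight commands differ only in the feature name, the same sequence can be written as a short loop.  A sketch, using the same feature names:

```bat
@echo off
rem Enable each required IIS feature in turn (same list as above).
for %%F in (IIS-WebServerRole IIS-WebServer IIS-ApplicationDevelopment ^
            IIS-ISAPIExtensions IIS-ASP IIS-WebServerManagementTools ^
            IIS-Security IIS-WindowsAuthentication) do (
    dism /Online /Enable-Feature /FeatureName:%%F
)
```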

Create Virtual Folder


I created the folder "inventory" beneath "c:\inetpub\wwwroot", but you could put it anywhere really.  The main thing is to point the "/physicalPath:" parameter to the appropriate location.  The APPCMD command is easy to use for creating and configuring the virtual directory.  For this project, I'm building the virtual directory under the Default web site.

appcmd add vdir /app.name:"Default Web Site/" /path:/inventory /physicalPath:c:\inetpub\wwwroot\inventory
appcmd set config "Default Web Site" /section:windowsAuthentication /enabled:true /commit:apphost
appcmd set config "Default Web Site" /section:anonymousAuthentication /enabled:false /commit:apphost
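One side note: as written, the last two commands flip authentication for the entire Default Web Site.  If you'd rather scope the change to just the new virtual directory, APPCMD also accepts a config path.  A sketch:

```bat
rem Scope the authentication settings to the /inventory path only,
rem leaving the rest of the Default Web Site untouched.
appcmd set config "Default Web Site/inventory" /section:windowsAuthentication /enabled:true /commit:apphost
appcmd set config "Default Web Site/inventory" /section:anonymousAuthentication /enabled:false /commit:apphost
```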

Install SQL Server 2008 R2 Express

Actually, the first step is to launch the installer (.exe) from a CMD console using /Action=Install /UIMode=Normal.  This enables the "Ready to Install" step in the left-hand vertical list of steps shown in the installation dialog.  For whatever reason, if I simply double-click the .exe and run the installation "normally", it doesn't show this feature.  You need the "Ready to Install" step because it's the only place that shows the path to the "ConfigurationFile.ini" response file it creates, and it allows you to abort the "real" installation at the final step while keeping the .INI file.

Once you have the .INI, the CMD syntax for using it is pretty simple, but you still need to make some very minor modifications to the .INI first.

INI modifications

Add:     IACCEPTSQLSERVERLICENSETERMS=1
Change:  QUIETSIMPLE to "True"
Change:  SECURITYMODE to "SQL"  (you don't have to do this, but I prefer to)
Add:     SAPWD=[enter a strong password here]
Change:  TCPENABLED to "1"
Change:  NPENABLED to "1"

You could set QUIET="True" for no dialog display, but I like to see some progress since it's pretty slow and pauses a few times along the way.

The installation syntax (assumes both files are in the same folder):

SQLEXPRWT_x86_ENU.exe /ConfigurationFile=ConfigurationFile.ini
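In a wrapper script, it's worth checking the installer's exit code before moving on to the database scripts.  A sketch, assuming both files sit next to the script:

```bat
@echo off
rem Run the SQL Express installer with the captured response file and
rem stop if setup reports a failure.
cd /d "%~dp0"
SQLEXPRWT_x86_ENU.exe /ConfigurationFile=ConfigurationFile.ini
if errorlevel 1 (
    echo SQL Express setup failed.
    exit /b 1
)
```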

But wait - There's more!


After installing the database, I still need to automate the setup of the database schema: create the tables and users, and grant permissions.  I created some T-SQL scripts and saved them in a folder.  Then I use the SQLCMD command to execute them using the default "sa" user account (note that I've replaced the actual password with "**" below; you have to specify the real password).

cd "\Program Files\Microsoft SQL Server\100\Tools\Binn"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 1_create_database.sql -o "%temp%\1_sql.log"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 2_create_login.sql -o "%temp%\2_sql.log"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 3_create_user.sql -o "%temp%\3_sql.log"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 4_create_table.sql -o "%temp%\4_sql.log"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 5_grant_user_privs.sql -o "%temp%\5_sql.log"
sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i 6_sample_data.sql -o "%temp%\6_sql.log"
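Since the scripts are numbered and share the same connection settings, the six calls can also be collapsed into a loop.  A sketch, assuming the same instance name and file names as above (and, again, ** stands in for the real sa password):

```bat
@echo off
rem Run the numbered T-SQL scripts in order against the INVENTORY instance,
rem logging each one's output separately under %temp%.
cd /d "%ProgramFiles%\Microsoft SQL Server\100\Tools\Binn"
for %%S in (1_create_database.sql 2_create_login.sql 3_create_user.sql ^
            4_create_table.sql 5_grant_user_privs.sql 6_sample_data.sql) do (
    sqlcmd -U sa -P ** -S %ComputerName%\INVENTORY -i %%S -o "%temp%\%%~nS.log"
)
```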


As an example of the T-SQL script, this is "1_create_database.sql" taken from the project at build 3...

CREATE DATABASE [Inventory] ON PRIMARY
( NAME = N'Inventory', FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL10_50.INVENTORY\MSSQL\DATA\Inventory.mdf' , SIZE = 2048KB , FILEGROWTH = 1024KB )
 LOG ON
( NAME = N'Inventory_log', FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL10_50.INVENTORY\MSSQL\DATA\Inventory_log.ldf' , SIZE = 1024KB , FILEGROWTH = 10%)
GO
ALTER DATABASE [Inventory] SET COMPATIBILITY_LEVEL = 100
GO
ALTER DATABASE [Inventory] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [Inventory] SET ANSI_NULLS OFF
GO
ALTER DATABASE [Inventory] SET ANSI_PADDING OFF
GO
ALTER DATABASE [Inventory] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [Inventory] SET ARITHABORT OFF
GO
ALTER DATABASE [Inventory] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [Inventory] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [Inventory] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [Inventory] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [Inventory] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [Inventory] SET CURSOR_DEFAULT  GLOBAL
GO
ALTER DATABASE [Inventory] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [Inventory] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [Inventory] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [Inventory] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [Inventory] SET  DISABLE_BROKER
GO
ALTER DATABASE [Inventory] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [Inventory] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [Inventory] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [Inventory] SET  READ_WRITE
GO
ALTER DATABASE [Inventory] SET RECOVERY SIMPLE
GO
ALTER DATABASE [Inventory] SET  MULTI_USER
GO
ALTER DATABASE [Inventory] SET PAGE_VERIFY CHECKSUM
GO
USE [Inventory]
GO
IF NOT EXISTS (SELECT name FROM sys.filegroups WHERE is_default=1 AND name = N'PRIMARY') ALTER DATABASE [Inventory] MODIFY FILEGROUP [PRIMARY] DEFAULT
GO

And, here's an example of the T-SQL script "5_grant_user_privs.sql", taken from the project at the same build...

USE [Inventory]
GO
GRANT DELETE ON [dbo].[AuditInventory] TO [InvUser]
GO
GRANT INSERT ON [dbo].[AuditInventory] TO [InvUser]
GO
GRANT SELECT ON [dbo].[AuditInventory] TO [InvUser]
GO
GRANT UPDATE ON [dbo].[AuditInventory] TO [InvUser]
GO
GRANT DELETE ON [dbo].[Collections] TO [InvUser]
GO
GRANT INSERT ON [dbo].[Collections] TO [InvUser]
GO
GRANT SELECT ON [dbo].[Collections] TO [InvUser]
GO
GRANT UPDATE ON [dbo].[Collections] TO [InvUser]
GO

Conclusion

I haven't described the download and upload aspects yet, but I will in the near future.  Those features are interesting in and of themselves.  But for now, hopefully this will keep you entertained (or put you fast asleep).

As I've said many times before: Microsoft gives you tons of goodies to help automate almost any task.  The combination of silent installation capabilities, scripts, commands like DISM, APPCMD, SQLCMD, and the ability to string it all together in a simple BAT script, opens the door to unlimited possibilities.  You can wrap all this in a nice .MSI using InstallShield as well, but I wanted to show that it's possible to do all this with absolutely ZERO cash spent on software product licensing (besides Windows itself).  One final note, if you wrap all this in a script, be sure to run the script using "Run as Administrator".
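To make that concrete, a top-level BAT wrapper might look something like this.  The four "call"ed script names are hypothetical placeholders for the DISM, APPCMD, and SQLCMD steps shown earlier; the installer file names are the ones used above:

```bat
@echo off
rem Top-level installer sketch (run with "Run as Administrator").
rem The called .bat names are placeholders for the steps shown earlier.
setlocal
cd /d "%~dp0"

call install_iis_features.bat || goto :failed
call configure_website.bat || goto :failed
SQLEXPRWT_x86_ENU.exe /ConfigurationFile=ConfigurationFile.ini
if errorlevel 1 goto :failed
call run_sql_scripts.bat || goto :failed

echo Setup complete.
exit /b 0

:failed
echo A setup step failed.
exit /b 1
```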

Cheers!


Posted in command, databases, iis, installation, network administration, projects, scripting, software development, sql express, web development

Monday, 23 April 2012

Metro on Windows 8 vs Windows Server 2012

Posted on 20:19 by Unknown
I posted this on my Google+ page today and it got me thinking about "why?"

First off, the Metro implementation on Server is not the same as it is on the Desktop (aka "Client") version.  It's not that the theme or platform services are different, but that a different set of features is layered on top of them: namely, the applets and utilities provided for configuring, troubleshooting and maintaining a "server" as opposed to a "client".  In my opinion, for the scope of features provided within the "server" paradigm, the Metro UX is a MUCH better fit.  This almost certainly sounds absurd to say (and for you to read, I'm sure).  "A Metro interface on a 'Server' operating system?"  Yes!  It actually works.  But here's the irony...

Metro is a better fit on "server" than on "Windows 8" by a factor of a Gazillion-Trillion-Billion to One.

I'm sorry, Windows 8 fans, but on the standard "desktop" configuration, I'm not a fan of Metro.  If I had a tablet on which to put it through some meaningful paces, I might have a very different opinion of it.  But on a traditional desktop or laptop, I still believe that the tile concept is *NOT* the most efficient UX construct for a mouse and keyboard.  It is more efficient for hand and finger gestures.  The tiles are scaled on a factor that more ideally matches the scale and topography of direct hand movements.  A mouse is an intermediary instrument that re-scales movement and articulation, such that large tiles are actually inversely proportionate to the scale of movement.  In English, from a purely engineering perspective: it is not efficient.

I relate this to comparing the handling of sugar cubes with chop sticks as opposed to using them to handle grapefruit or softballs.  At some point, the relative size ratio, and weight, make it more efficient, and convenient to use bare hands.

Those are my own words.


Posted in microsoft, windows server, windows8

Sunday, 22 April 2012

AutoCAD and Where I'm At With It Today

Posted on 06:01 by Unknown
For some reason I get e-mails fairly often asking if I'm still "working with AutoCAD".  The answer is actually "yes", but with some qualification of the answer.  In any case, here's a brief summary of the twisted path I've taken with respect to my involvement with Autodesk products.

In short, I started out using AutoCAD, then customizing AutoCAD, then (now) deploying and maintaining AutoCAD.  Actually, replace "AutoCAD" with "Autodesk Products", because I'm dealing with Map 3D, Inventor, Civil 3D, Raster Design, Maya, 3DS Max, and more.

Strap yourself in, drink plenty of coffee, because this is going to be one of THE most incredibly boring stories you will ever read.  Don't say I didn't warn you...

The 1980's

I started around 1984 as a draftsman (now called, more politically correctly, a drafter or draftsperson) at a Naval Engineering and Design firm working on contracts involving overhaul/retrofit of U.S. Navy ships.  At that time, I was working with plastic media on Mylar film.  Occasionally with pencil or Rapid-o-graph ink on Sepia, Vellum or blue-line paper.  I used to run my own blue-line prints too.  All the good stuff: a parallel bar (aka "drafting machine") with cable guides, tri-scale bar, French curve templates, flexible guides, various templates, erasers, erasing fluid, coffee, junk food, and the ubiquitous Sony (cassette tape) Walkman blasting some horrifically bad 1980's music at dangerously high volume directly into my ears.  Those were good times.

Somewhere around 1985 I was introduced to a mainframe piece of shit CAD system named AutoTroll.  I thought it sounded like a robotic creature that lived under a bridge.  I've said many times before that those days were filled with overpriced, under-powered crap products that today's AutoCAD can easily out-perform at 1/100th the price tag.

As the 1980's progressed (or degressed, depending on how you view it), we expanded into more types of design work: oil drill rigs, special-purpose nautical equipment, land-based facilities (buildings), and so on.  The work followed the economy and what was going on around the world at the time.

The 1990's, part 1

After moving to my third job in the field of Naval Engineering and Design (ok, most of those firms preferred "Naval Architecture, Engineering and Design" but I ran out of breath before getting through that), I was put through training, and began using Intergraph VDS (along with EMS, PDU/PDM) where I developed 3D models of U.S. Naval ship portions.  We never modeled an entire ship because it wasn't what we were contracted for.  Our role was overhaul/retrofit of spaces, systems, and components.  The hull form modeling was left for other companies at that time.  I moved from Intergraph to Microstation, to AutoCAD R10.  R10 was the first version I worked on.

At this point, the U.S. Navy would only allow AutoCAD to be used for "non-design" work.  Sounds confusing, doesn't it?  What that meant, was that the title sheet, bill of material, notes, and references, could be handled with AutoCAD, while the "detail design" of systems, spaces and components had to be done with one of our ever-growing arsenal of UNIX-based CAD garbage.  It was painful and frustrating. We requested over and over that the Navy reconsider, but they were adamant (and slow).

The 1990's, part 2

Soon after the Navy approved the use of AutoCAD for general systems design, we dove in with both feet and pretty much left the UNIX design tools behind for good.  We still maintained some content with them, but we migrated/translated content to AutoCAD at every opportunity we got.  It was a good move.

My employer hired a fairly-sharp MIT graduate to develop custom extensions for AutoCAD to help automate the design process for each major discipline, which back then meant: Piping, HVAC, Electrical, Electronic, Hull Design, Hull Outfitting, and Scientific/Engineering.  My area was in both Piping and HVAC systems and component design.  However, we had a fairly good rotation program, so I got to work for a period of time in each of the other departments and learned about their worlds quite a lot.  I'm sure the grammar and syntax in that previous sentence is somehow wrong, but whatever.

So, this MIT guru was a major player in the world of LISP and C++ so they dangled a big paycheck in his face and he went to work.  The results were incredible, BUT... he was contracted out of our San Francisco office, who only worked on Chevron contracts at that time. So all of his automation tools were built for that, rather than U.S. Navy ships.  That made for some frustrating work.  Our East coast teams requested features for our type of work, but we were rebuffed and ignored.  That led me to pick up a book and learn AutoLISP and DIESEL/MNU programming on my own.  I stayed late after work every night and practiced and tinkered.  Soon I began to review the MIT guy's code and make small changes to suit our needs.  That led to making more changes, and eventually just making my own set of automation tools.

Once my supervisor saw what I was doing, he asked me to push faster and get some tools ready for our department to start using.  Once I got that going, other departments asked me to do the same for their design needs.  It grew from there.  I was with that employer for about ten years.

The 1990's, part 3

A job opened up at a large defense contractor in the area (no names) looking for someone to lead the effort to implement their first "AutoCAD design environment".  I jumped at the chance and was soon involved with implementing it on their first Windows NT 3.51 LAN as well (they were on WFWG and Novell to this point, aside from the dozens of UNIX LAN's scattered around).  That meant I was put through training courses on Windows NT, and various Microsoft technologies.  Soon after, I was moved into the double-role of Windows LAN Administrator and AutoCAD Administrator for a user base of around 3,000 (that one location had over 18,000 employees, 9,000 of which were daily computer users, so I dealt with roughly a third of them).

One of the major tasks I was handed was to recreate the design automation tool development results from my previous job.  This was much bigger scale and required a more serious approach to handle the complexity of the environment, the more rigorous expectations, more focus on performance and stability, and dealing with a tiered security access control environment as well.  That meant building tools so that certain people could access higher-level features than others based on their designated role in the organization.  It was a very interesting time for me.

This was also when I began my college education in Information Science.  So I was getting a brain-full of new ideas from school and work at the same time.  My head somehow didn't explode.  Oh yeah, I was also dabbling with Cold Fusion CFML and web site development.  Fun times.

The early 2000's

After I graduated college, I left for yet another new job.  This time however, my balance of customization versus administration was shifted from roughly 70/30 to around 50/50.  I had picked up skills in Microsoft SMS 2.0 while at the previous job, and SMS 2003 was in early development at the time (I was working on the beta program by invite from a close friend, very lucky indeed).  My emphasis was at first on rebuilding a newer generation of design automation tools, but once that was done and in production, I shifted my focus towards deployment and administration.  I was also pegged to lead our company-wide migration from Windows NT 4.0 to Windows Server 2003 and Active Directory.

That meant more training, more lab time, and more use of VMware than I ever expected.  The migration went very well and I began wearing a Microsoft hat more than an Autodesk hat.  The hours I spent with Visual LISP, VLIDE, menu editing, customizing profiles, building deployments and dealing with license servers, were fading.  My new goodies were then Active Directory, Group Policy, SUS, then WUS, then WSUS, SMS 2003, then Configuration Manager 2007, VBscript, Javascript, WMI, ADSI, LDAP, COM and so on.  I also began working more with ASP web development and SQL Server backend data management.  This soup of stuff began to intertwine and soon I was building web apps that interfaced with all of these things to help manage and see what was going on throughout the environment.

All of this ramped up more when we were hit over the head with SOX compliance, and ISO certification.

As for my involvement with Autodesk products:  From 2000-2003 I was still heavily involved with customization.  I had joined ADN with a team of four others, and acted as our ADN administrator.  We went to a lot of Autodesk University conferences and we had a lot of fun.

From 2004-2006, I was building deployments, creating custom menus and profiles, dealing with FLEXlm license services, lots of phone calls with Autodesk involving technical support or licensing or contracts and purchases.  I had shifted the role of ADN lead and development lead, over to one of my team members.  He took off with it and pushed into .NET development with ObjectARX and it was awesome to see the baton passed to such an eager group who ran with it in the right direction.

I worked with Autodesk products from version 2000 and 2000i (remember that?), through 2007.

In 2007, our company was sold and split and I left for a small consulting firm.  That didn't last long, as the economy went off a cliff and they closed our fledgling office in early 2008.

After getting laid-off, I went back to the big defense contractor I worked for in the 1990's.  This time it was all about packaging and deployment of applications in general.  I shared Autodesk product duties with my buddy Dave (another Dave), but we also dealt with dozens of products from other vendors.  Our team of 14 supported over a thousand products for the organization.  It was a busy place indeed.

I left that job for another consulting firm in 2010, focusing on Microsoft platform technologies. I now had my MCITP 2008 certification and was finally diving into that world head-first.  I still bring my development skills and bag of tricks along because it always comes in handy.  I still believe that a developer in the systems engineering world has a leg up.

Late 2000's and Today

From 2008-2010:

  • Building and customizing  Autodesk  network deployments (v. 2008 through 2010)
  • Implementing and managing FLEXlm and FlexNet Manager services and reports
  • VBscript, KiXtart, CMD scripting
  • Packaging applications with Wise Package Studio and Wise Script for Altiris deployments
  • Building web applications to track and report applications inventory and licensing
  • I wrote a book or two for Amazon Kindle
  • I joined Facebook in late 2006
  • I started this blog early 2007, killed it and started it back up again
From 2010-Today:
  • Building and customizing Autodesk network deployments (v. 2010 through 2013)
  • Deploying "deployments" with System Center Configuration Manager
  • Packaging applications with InstallShield and AdminStudio
  • VBscript, PowerShell, CMD script development
  • Working with MDT 2010, WSUS, Group Policy
  • Building web applications to track and report all sorts of things
  • I wrote a few more books for Amazon Kindle
  • In early 2012 I tried to retire from this blog, but that didn't last
So, there it is.  My story, or at least part of it.  The tedious and excruciatingly boring part.  I hope it helped you get to sleep.  Cheers!



Posted in autocad, autodesk, software deployment, software packaging

Friday, 20 April 2012

Favorite Beers (this week only)

Posted on 17:06 by Unknown
Just like my previous "top 10" list, here's an updated list for this week.  The previous still stands.  These are really additions:

  1. Weyerbacher "Insanity"
  2. Dogfish Head 120 Minute IPA
  3. Duvel
  4. Chimay Grande Reserve
I know it's only four, but hey.  They're pretty darn good.



Posted in beer

The Nicest IT and IT Vendor Folks I Know

Posted on 04:13 by Unknown
I've ranted many times before about how it's unfair to "hate" an entire company without providing a rationale based on specific experiences and specific people.  A company, a business, an organization is simply a group of people.  People are what give it life, character and direction.  People are what provide the experience to their customer counterparts.  It's ok to be resentful of specific people who provide poor service; that's just human nature.  But hating an entire company because of one or even a few people within it is not much different from hating an entire race or ethnicity because of experiences with only a few.

I have always had a "good" habit of commending people for doing a good job.  I've been known to call companies and tell someone's supervisor that they did an outstanding job and that I appreciated it.  It's rare, I know.  Most people cringe when you ask them to provide the name and number of their supervisor.  That's part of our American culture I suppose.

But I wanted to point out a few people I've met or had correspondence with over the years that I consider "nice" and "helpful".  These are people who are generally nice to everyone, but since I have control over this blog I can use it to name a few of them and say that I appreciate their efforts and their generosity.

Autodesk:  Shaan Hurley
Autodesk:  Frank Moore
Autodesk:  Keane Walmsley
Autodesk:  Jerry Milana
Autodesk:  Bud Shroeder
Autodesk:  Jeff Lotan
Autodesk:  Jay Tedeschi
Autodesk:  Lynn Allen
Microsoft:  Wally Mead
Knowledge Factory:  Johan Arwidmark
BetaNews:  Joe Wilcox
Self:   Kara Swisher
Self:  Mary Jo Foley
1E / MyITforum:   Rod Trent
SyCom Technologies:  Rob Spitzer
SyCom Technologies:  Jim Bezdan (going to 1E soon)
Rand / Imaginit:  Fred Stewart
Rand / Imaginit:  John Jansen
Endurance IT Services:  Everyone. I work there.  They are all nice people :)
NG IT:  Just about everyone (Steve, Dave, Lynn, Marcy, other Dave, Brian, Ronnie, Shu-Ling, Karen, and on and on)

I'm sure I missed a lot of names, but if I forgot you, I apologize.
Posted in business, people

Don't Hate Statistics. Hate Bad Statistics

Posted on 02:49 by Unknown
You probably don't like the word "statistics" much. You probably think of bland, boring and uninteresting things when you see or hear the word. But you and I are surrounded by statistics every day.

I fell in love with statistics after my first semester course back in college. I hated it in high school, mainly because my high school teacher sounded like Ben Stein and put us all to sleep with his monotonous voice. But in college, I ended up taking an extra semester (over the curriculum requirement for my degree track) because I wanted to learn more.

Two hugely important things I've learned to appreciate, and wanted to share with you, are explained below. They pinpoint the two most overlooked and often ignored aspects of the statistical references you see and hear on the news all the time:

1) Statistical Results without a Clear "Margin of Error" mean NOTHING

2) "Cause and Effect" Claims are almost Always Bullshit

Here's why...

Margin of Error: It matters a lot.

If I tell you that the results of an opinion poll indicate people favor one political candidate over their opponent by 52% to 48%, you would think that's worthy data. Even 55 to 45. But if I then told you that the margin of error (a factor of sloppiness) was 10%, that means the real numbers could be off by 10% in EITHER direction. Net result? It's a tie. Even at 5% MOE you could consider it a tie, for the 52-48 numbers anyway.

If that were applied to mapping directions and I told you that the odds of the destination being correct were 90%, that would mean 1 out of every 10 times you'd end up in the wrong place. Not very good, is it?

Cause and Effect

One of the most common uses of statistics is to establish a claim of "cause and effect". You hear it all the time. Things like "people with red hair buy blue cars more often" and so forth. Or "people who own a Lexus have kids who enter college at a higher rate than those who don't."

But does that mean that owning a Lexus is the reason? Or is it the income level of those who own them? And how was the sample data collected? From one geographic area? Was it based on survey cards supplied by the dealerships? How accurate is that? How many people were surveyed? 10? 100? 1,000? 100,000? One?!
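To make the Lexus example concrete, here's a rough Python sketch (with entirely made-up probabilities; none of this reflects real data) of how a hidden factor produces a "cause and effect" illusion. Income drives BOTH Lexus ownership and college entry; owning the car has zero causal effect, yet the raw rates still make it look like it does.

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

# Simulate 10,000 hypothetical households. Income level is the
# confounder: it raises the odds of BOTH outcomes independently.
people = []
for _ in range(10000):
    high_income = random.random() < 0.30
    owns_lexus = random.random() < (0.40 if high_income else 0.05)
    kid_in_college = random.random() < (0.80 if high_income else 0.30)
    people.append((owns_lexus, kid_in_college))

def college_rate(owns):
    """College-entry rate among owners (True) or non-owners (False)."""
    group = [college for owner, college in people if owner == owns]
    return sum(group) / len(group)

print(f"Lexus owners: {college_rate(True):.0%} of kids enter college")
print(f"Non-owners:   {college_rate(False):.0%} of kids enter college")
```

Run it and the owners' rate comes out far higher, even though the simulation never connects the car to the kids at all. That gap is the confounder talking, not causation.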

Do you see where I'm going with this? So often we let this crap seep into our eyes and ears, but we don't stop to ask how it was derived. You wouldn't do that with food or medicine. You'd apply some scrutiny to its safety before risking poisoning your body. But allowing bullshit statistical claims into your brain poisons your thinking. Don't do it! If food isn't cooked or packaged properly, you don't eat it (hopefully). If statistical claims aren't provided with a sample basis and a margin of error, call "bullshit!" and ignore them as well. Life is too short.

Namaste
Posted in statistics, news, marketing
