Windows 10

Oct. 19th, 2014 09:13 pm
mellowtigger: (Daria)
For the last few weeks, I've been using the Windows 10 beta on my home pc. The short review:

It's not as awful as Windows 8, so I guess I could use it.

Yes, they skipped Windows 9. The most plausible explanation is that many programs check for earlier versions of Windows with a search string like "Windows 9*", which would also match Windows 95 and Windows 98. Skipping ahead to Windows 10 avoids the scenario where a program thinks it's running on an old system instead of a new one.
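To illustrate, here's the kind of brittle check (purely hypothetical code, not from any real product) that the renaming supposedly avoids:

```powershell
# hypothetical example of a lazy version check that would misfire on "Windows 9"
$os = (Get-WmiObject Win32_OperatingSystem).Caption   # e.g. "Microsoft Windows 7 Professional"
if ($os -like '*Windows 9*') {
    # this branch was meant for Windows 95/98, but a "Windows 9" would match too
    'taking the legacy code path'
}
```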

I upgraded my existing Windows 7 installation to Windows 10, so the process took a long time with a lot of reboots. Afterwards, performance was awful. It was really, really slow. I started disabling services that were hogging disk access: automatic defragging, usage monitoring to "optimize" disk access... anything active that I didn't immediately need. Finally, performance was back up to Windows 7 speeds.

I started up powershell. I checked, and it's running at version 5 now.  Yay!  I tried out the new command that linux people will recognize from Debian's "apt-get". It allows a 1-line command that installs a software program onto your computer from a centralized host. It makes software management much easier on linux, and now Microsoft has caught up. There weren't any packages available yet that I was interested in using... but the available libraries will grow with time. This feature is very nice!

Import-Module OneGet
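With the module imported, a session looks something like this (the package name is just a placeholder, and the cmdlet names are from the OneGet preview module as I understand them):

```powershell
Get-PackageSource                    # list the configured package repositories
Find-Package -Name 'somepackage'     # search the repositories (placeholder name)
Install-Package -Name 'somepackage'  # the 1-line, apt-get style install
```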

[image: grey slider on grey scrollbar]
The interface is still a little goofy, and I still like Windows 7 better. I can tolerate this new one, at least. The only interface issue that gave me trouble was my difficulty in detecting where the slider is located on a scrollbar. The whole thing is soft grey, and one grey is not sufficiently distinct from another grey. Click the picture on the right to see for yourself. It's a screenshot from my desktop.  I have to look for a moment before my eyes finally detect the slider in the scrollbar.

I decided to provide feedback to Microsoft about the scrollbar. Bad move! Doing so somehow tied my profile (which was previously a standalone account) to my Microsoft online account (which I never wanted). It also activated OneDrive and started syncing files on my pc to Microsoft's servers (which I definitely never wanted). I disabled it immediately:

1. Run mmc.exe and add the "Local Computer Policy" snap-in (or just run gpedit.msc)
2. Navigate to Computer Configuration\Administrative Templates\Windows Components\OneDrive
3. Edit "Prevent the usage of OneDrive for file storage"
4. Set its state to Enabled
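If you prefer to do it from an elevated Powershell prompt instead of the policy editor, the same policy maps (as far as I can tell; the registry value name here is my assumption) to a registry setting:

```powershell
# assumption: "Prevent the usage of OneDrive for file storage" maps to this value
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\OneDrive'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name 'DisableFileSyncNGSC' -Value 1 -Type DWord
```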

I also took back control of my profile by disconnecting the Microsoft online account:

PC Settings / Users and accounts / Your profile

I've decided that Windows 10 is passable. There's no compelling reason to upgrade to Windows 10... except the eventual shutdown of support for Windows 7 by Microsoft. Welcome to the future. Hurrah?
mellowtigger: (penguin coder)
None of those annoying animations were ever supposed to exist on Windows 7. There was such an outcry, though, that Microsoft created a hotfix that restores support for their deprecated animation technology, Microsoft Agent. Doing a little more digging, I found sources for all of their original agents.

Read the comments prior to each section to find the urls you need to visit to download all of the relevant files. Each of them should produce a file with a .ACS extension in the C:\WINDOWS\MSAGENT\CHARS\ directory.  This Powershell script will demonstrate all of them.

# Demonstration of Microsoft Agent controls.
# NOTE: These agents are deprecated.  They were never supposed to be on Win7.

# The primary Agent control can be downloaded for Windows 7 as a hotfix.
# It will install c:\windows\msagent\chars\merlin.acs, plus other supporting files.
# Download the hotfix from this url:

$MI6 = new-object -com agent.control.2
$MI6.connected = $true

$agent = $MI6.characters.character('Merlin')
$agent.think('Hello!  I''m over here!')
$agent.think('I and the other Microsoft Agents are here.')
$agent.think('I am Merlin.')
$agent.think('Only 1 agent can be loaded at a time.')
while ($agent.visible) { sleep -Milliseconds 100 }

# You can download the other official agents at these urls.
# Genie  @
# Robby  @
# Peedy  @
# Clippy @
# list of their animations @

$agentlist = @(('Genie','Genie.acs'),('Robby','Robby.acs'),('Peedy','Peedy.acs'),('Clippy','Clippit.acs'))
foreach($name in $agentlist.GetEnumerator()) {
    $MI6.Characters.load($name[0], $name[1])
    $agent = $MI6.characters.character($name[0])
    $agent.think('Hi, I''m '+$name[0]+'.')
    while ($agent.visible) { sleep -Milliseconds 100 }
}

# A Microsoft Office agent no longer downloadable from Microsoft (that I could find).
# Not all animations from above are available on this character.
# Rover @

$MI6.characters.load('Rover','Rover.acs')
$agent = $MI6.characters.character('Rover')
$agent.think('Hi, I''m Rover.')
sleep -seconds 6

# Non-Microsoft agents that people have developed:

I call this script Clippy.ps1.  I loathed Clippy passionately.  Now, finally, he's my puppet.  Come to the dark side; feel the power(shell)!
mellowtigger: (penguin coder)
I wrote a Powershell script to convert my Firefox browser bookmarks into shortcut files on my hard drive. I was getting too large a collection, and saving new bookmarks was finally taking obviously more-than-zero time to process. From now on, I'll just drag shortcuts into folders for easier sorting and searching.

Powershell V3 includes new JSON commands, but they didn't work because my data store was too large. I had to use a 3-line command instead of a 1-line command to do the same thing, so it was an easy workaround. I retained the same directory structure and creation timestamp as in my bookmarks, too. I didn't bother with webpage icons, but the script could be upgraded to handle those too.

# Convert bookmarks from Firefox browser JSON backup to Windows 7 filesystem
# This script works in PowerShell v3.

# configure these variables as needed
$sourcefile = 'p:\www\bookmarks-2013-02-16.json'
$destfolder = 'p:\www\bookmarks\'
$foldertype = 'text/x-moz-place-container' # specific to Mozilla Firefox browser

# unix epoch time is milliseconds since 1970-01-01 00:00:00 GMT
$epoch  = [datetime]'1970-01-01'

# i found that a delay was necessary, otherwise errors cropped up occasionally
$delay = 10 # milliseconds

function CheckNodeType([object]$node, [string]$parent) {
    if ($node.type -eq $foldertype) {
        ProcessDirectory $node $parent
    } else {
        ProcessBookmark $node $parent
    }
}

function ProcessDirectory([object]$node, [string]$parent) {
    # remove bad characters from directory names
    $fixname = [system.text.regularexpressions.regex]::replace($node.title,"[^0-9a-zA-Z._]","")
    $dir = $parent + $fixname + '\'
    if (-not (test-path $dir)) {
        new-item -name $fixname -path $parent -itemtype 'directory'
        $offset = $node.dateAdded / 1000000
        $time   = [timezone]::CurrentTimeZone.ToLocalTime($epoch.AddSeconds($offset))
        sleep -Milliseconds $delay
        Set-ItemProperty -path $dir -name CreationTime -value $time
    }
    foreach($subnode in $node.children) {
        CheckNodeType $subnode $dir
    }
}

function ProcessBookmark([object]$node, [string]$parent) {
    # remove bad characters from filenames
    $fixname = [system.text.regularexpressions.regex]::replace($node.title,"[^0-9a-zA-Z. _]","")
    # truncate excessive filename lengths
    if ($fixname.Length -gt 200) {
        $fixname = $fixname.substring(0,200)
    }
    # process the good filename
    $file = $parent + $fixname + '.url'
    if (-not (test-path $file)) {
        $data = "[InternetShortcut]`nURL="+$node.uri
        new-item -name ($fixname + '.url') -path $parent -itemtype 'file' -value $data
        $offset = $node.dateAdded / 1000000
        $time   = [timezone]::CurrentTimeZone.ToLocalTime($epoch.AddSeconds($offset))
        sleep -Milliseconds $delay
        Set-ItemProperty -path $file -name CreationTime -value $time
        Set-ItemProperty -path $file -name LastWriteTime -value $time
    }
}

# read the JSON-formatted text string from the bookmark backup file
$json  = get-content $sourcefile -raw # -raw (new in v3) returns one string instead of an array of lines

# the native ConvertFrom-Json command does not work on very large JSON stores, so use this method instead
[Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | out-null
$jsser = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$jsser.MaxJsonLength = $json.Length
$tree  = $jsser.DeserializeObject($json)

# process the object tree
$menu  = $tree.children | where { $_.title -eq 'Bookmarks Menu' }
foreach($node in $menu.children) {

    # ignore management folders
    if (($node.title -ne 'Recently Bookmarked') -and 
        ($node.title -ne 'Recent Tags') -and 
        ($node.title -ne 'JSTOR liberator')) {

        CheckNodeType $node $destfolder
    }
}

It took several more hours than I expected to get that script working, but it works well.  I like my bookmark folders.  I expect to do some manual work to create a better sorting nomenclature for them.
mellowtigger: (penguin coder)
Powershell can be used with Internet Explorer to "screen scrape" websites. Launching these scripts through the task scheduler can be complicated to set up, especially if the destination url uses a secure protocol.  I wanted to avoid making the batch logon a local machine administrator, so I found the following security changes necessary to get the script to work properly.

1) Update the logon authority

1. The end user account should not be an administrator account, just a regular domain (or local) user.
2. Run the local security policy editor (c:\windows\system32\secpol.msc)
3. Navigate to Security Settings / Local Policies / Security Options
4. Click "Log on as a batch job", add the userid or security group
Preferably, use a domain security group designed specifically for this purpose.

2) Update the COM policy

1. Run the DCOM policy editor (c:\windows\system32\dcomcnfg.exe)
2. Navigate to Console Root / Component Services / Computers / My Computer
3. Right-click on My Computer, select Properties
4. Navigate to COM Security tab.
5. Click "Edit Default" for Launch and Activation Permissions
6. Add the userid or group and Allow both "Local Launch" and "Local Activation".

3) Disable user IE enhanced security

1. Run the server manager (Start / Computer / right-click, Manage)
2. Click to select the top entry "Server Manager (servername)"
3. Click link on right panel for "Configure IE ESC"
4. Toggle OFF for Users

4) Update the IE settings

1. Run the Internet Options editor (Control Panel / Network and Internet / Internet Options)
2. Navigate to Security tab
3. Make sure you are viewing the "Internet" zone
4. UNcheck (if using IE8 or prior) "Enable Protected Mode".
5. View the "Trusted Sites" zone
6. UNcheck "Enable Protected Mode"
7. Click the "Sites" button
8. Add your destination domain to the list

If you fail to perform these steps, then the powershell variable used to interact with IE will "break".  Either IE will fail to launch (possibly due to the choice of default webpage to load on initialization), or powershell will be unable to interact with the browser that launches.  OS security features will "break" the handle to the Document Object Model variable in IE, leaving empty values instead.
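For reference, here is a minimal sketch of the screen-scrape pattern that these steps enable (the url is a placeholder):

```powershell
# minimal IE automation sketch; fill in your own url
$ie = New-Object -ComObject 'InternetExplorer.Application'
$ie.Visible = $false
$ie.Navigate('https://example.com/')
while ($ie.Busy -or $ie.ReadyState -ne 4) { Start-Sleep -Milliseconds 200 }  # 4 = READYSTATE_COMPLETE
$doc = $ie.Document   # without the security changes above, this handle comes back empty
$title = $doc.title
$ie.Quit()
```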
mellowtigger: (penguin coder)
Can a powershell script determine if it's running in an interactive environment or not? I have a few methods that get close to the answer, but nothing offers satisfying certainty yet.

$host.name

powershell.exe yields 'ConsoleHost'
powershell_ise.exe yields 'Windows PowerShell ISE Host'

Each program, however, can operate in either batch or interactive conditions.

$process = gwmi win32_process -filter "ProcessID=$pid"
$batch = $process.commandline -match "-NonInteractive"

If someone programs the scheduled task with the "-NonInteractive" parameter, then the script can successfully read that parameter.  It's not the default, though.
I also checked the runspace properties:

$host.runspace.runspacestateinfo.state
$host.runspace.runspaceavailability

I always find the state info as 'Opened'. In gui conditions, availability has always been 'Busy'. In batch conditions, though, it is 'Available'.  Can I rely on this indicator?

I need to know, because I want to separate hardcoded userid/password combinations from my scripts.  I want to put them into the parameter of the scheduled task instead.  When testing those same scripts interactively, though, I want the script to prompt me for the data that it didn't receive through parameters.

Any script stuck at a user prompt is a Bad Thing [tm] when running in batch mode, since no one will ever answer it.  Using the '-NonInteractive' parameter is a good idea for all batch PowerShell scripts for this reason.  It seems to just suppress any command (or the partial execution of it) that would require user input.  I didn't even know this parameter existed until today.  It would have prevented an error in a production script that got stuck prompting for permission to delete a subfolder that I never expected to be there.  My script would have skipped that subfolder and gone on its merry way doing other things.

It still doesn't answer my question, though.  Is there a reliable way to tell if my script is running interactively or not?  Yes, it's a bit like asking which kind of virtual reality I live in, but sometimes it is important to know the distinction.  :)  Does anyone have the answer?

edit 2012.10.19: I forgot to mention previously, but someone on g+ pointed me to a webpage where contributors mentioned a few ways of dealing with this problem.  Among the solutions was one that works:  [environment]::userinteractive.
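Combining that with the command-line check from earlier gives something like this (my own sketch, not a guaranteed test):

```powershell
# sketch: treat the session as interactive only if the OS says so
# and the -NonInteractive switch is absent from the command line
function Test-Interactive {
    $cmdline = (gwmi win32_process -filter "ProcessID=$pid").commandline
    return ([environment]::UserInteractive -and ($cmdline -notmatch '-NonInteractive'))
}
if (Test-Interactive) { $cred = Get-Credential }  # prompt only when someone can answer
```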
mellowtigger: (penguin coder)
Someone on Google Plus asked for a quick program. I couldn't format the text properly on G+, so I'm posting it here instead.

# Save this text using Notepad to a file named recordsplit.ps1
# Right-click the new recordsplit.ps1 file, then select "Run with PowerShell"
# If it doesn't run, you need to enable powershell scripts...

# This script separates a text file into other text files, based on the delimiter text.
# Change the first 3 variables as needed.
# If more than 999 records will be used, be sure to change the "D3" in the $newfile to an appropriate digit length ("D4")


# example values; change them to match your own data
$file = 'c:\temp\records.txt'    # the input text file to split
$delimiter = '*BEGIN RECORD*'    # lines matching this pattern (-like) start a new output file
$keepdelimiter = $true           # $true copies each delimiter line into its output file

$counter = 0
$fileitem = gci $file
foreach ($line in (get-content $file)) {
    if ($line -like $delimiter) {
        $counter += 1
        $newfile = $fileitem.directoryname + '\' + $fileitem.basename + '.' + ("{0:D3}" -f $counter) + $fileitem.extension
        set-content -path $newfile -value $null
    }
    if (($line -ne $null) -and ($line -ne '')) {
        if (($keepdelimiter -eq $true) -or ($line -ne $delimiter)) {
            add-content -path $newfile -value $line
        }
    }
}

Enjoy.  :)
mellowtigger: (penguin coder)
[image: phone operator]
The more I learn about phone system technology, the more I wonder how anything ever works (successfully).

Among the hodge-podge of responsibilities I have at my job, I also manage 6 PBX phone systems of the Mitel / Intertel CS-5000 family. Thanks to a useful hint that my boss found in a Perl script via Google, I was able to establish phone system logging using Powershell. My programming will save us over $10,000 that we would have been charged by our vendor. That's half my annual salary, so I've earned my keep for a while. :)

I've redacted some of the details from my code for security reasons, but I wanted to make the source code public so it's easier for the next person to find useful material through a web search engine.

# log-smdr.ps1
# This Powershell script connects to a Mitel/Intertel CS-5000 series phone system, 
# directs it to open its SMDR stream, then logs the results to a text file.
# Station Message Detail Recording (SMDR) is a system feature that provides 
# a detailed record of outgoing and incoming calls.
# This script is intended to be restarted at midnight each night.

# Turning on SMDR reporting is a two-step process in DbStudio Session Manager.
# 1) System / Maintenance / SMDR, set "SMDR Output Active" to yes
#    ("Output Port" should already be set to "NONE")
# 2) System / Sockets / SMDR, set "Enable" to yes
#    (System / Cabinet / Sockets / SMDR, in some older systems)

# (example perl script)

param ([string]$site='needparm', [string]$logdir='C:\script\log\', [boolean]$debug=$false)

# convert $site parameter into ip/port combination
switch ($site) {
    'S1' { $ip=''; $port=4000 }
    'S2' { $ip=''; $port=4000 }
    'S3' { $ip=''; $port=4000 }
    default { $ip=''; $port=4000 }
}

# find a new filename for logging this session
$scriptps1 = $MyInvocation.MyCommand.Name
$scriptname = $scriptps1.substring(0,($scriptps1.length - 4))
$datelog = get-date -f 'yyyyMMdd'
$version = 0
do {
    $version += 1
    $filelog = $logdir + $scriptname + '.' + $datelog + '.' + $site + '.' + $version + '.txt'
} while (test-path $filelog)
set-content $filelog $null
if ($debug) {write-host 'logfile:' $filelog}

# create a tcp socket and a destination
# fyi, the TcpClient object did not work with SMDR (perhaps some default values interfered?)
# so I took it down a layer to the Socket object which works nicely
$sockfam  = [System.Net.Sockets.AddressFamily]::InterNetwork
$socktype = [System.Net.Sockets.SocketType]::Stream
$sockpro  = [System.Net.Sockets.ProtocolType]::TCP
$socket   = New-Object System.Net.Sockets.Socket $sockfam, $socktype, $sockpro
$ipaddr   = [System.Net.IpAddress]::Parse($ip)
$ipdest    = New-Object System.Net.IPEndPoint $ipaddr, $port

# establish the connection
if ($debug) {write-host 'connecting:' $ip ':' $port}
$socket.Connect($ipdest)
if ($socket.Connected -eq $false) {
    if ($debug) {write-host 'connection: FAILED'}
    exit
}
if ($debug) {write-host 'connection: OPEN'}

# send the SMDR command ASCII character string (no password)
# dec   oct  hex  binary    char  meaning
#   2   002  02   00000010  STX   Start of Text
#   0   000  00   00000000  NUL   Null char
# 132   204  84   10000100  „     Double low-9 quotation mark
$cmd = @([byte]2,[byte]0,[byte]0,[byte]0,[byte]132,[byte]0)
$socket.Send($cmd) | out-null
if ($debug) {write-host 'connection: CMD SENT'}

# the socket's .Receive() will Block, causing the program to freeze until bytes are received,
# so set up a stream which provides the useful .DataAvailable property instead
# incoming records are always 86 bytes long
$bytes = new-object System.Byte[] 86
$stream = new-object System.Net.Sockets.NetworkStream($socket)

do {
    start-sleep -m 1000
    if ($stream.DataAvailable) {
        $socket.Receive($bytes) | out-null  # DataAvailable is true, so this returns immediately
        $result = [System.Text.Encoding]::ASCII.GetString($bytes[0..84]) # ignore trailing CR/LF
        if ($debug) {write-host $result}
        add-content $filelog $result # append data to text log file
    }
    $datetest = get-date -f 'yyyyMMdd'
} while ($datetest -eq $datelog)

if ($debug) {write-host 'connection: CLOSING'}
$stream.Close()
$socket.Close()
if ($debug) {write-host 'connection: CLOSED'}
$socket = $null
You turn on SMDR by using the DbStudio Session Manager software that comes with your PBX system. That application directory also contains \DiagMon\Diagmon.exe, a program that can also be used to monitor (but not log, as far as I can tell) SMDR output from your phone panels. I intend to use DiagMon with packet sniffing software to reverse engineer the protocol for handling password-enabled SMDR. It's really hard to come by programmer-focused PBX documentation. I even contacted Mitel, but they defer to local vendors who have a financial disincentive to provide such documentation. :(

Here's a redacted version of the debug output from this script:
connecting to : 4000
connection: OPEN
connection: CMD SENT
Q   Station Message Detailed Recording                          17:01:24 04-07-2012
R   NET 1234  P0001 94567                        17:11 00:03:34 $00.00
R   NET 1234  P0001 95678                        17:16 00:00:09 $00.00
R   NET 2345  P0001 96789                        17:21 00:00:14 $00.00
R   IN  3456  90001                              17:19 00:04:56 $00.00
My next project will take these .TXT log files, parse them, and store the data fields in a SQL Server table. Then I can do lots of fun stuff like calculate the average length of time that people are spending in our phone tree before they reach a live person, or determine if people are hanging up in frustration before reaching any person.
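As a rough sketch of that parsing step (the filename is hypothetical, and the column offsets are my guesses from the sample output above, so verify them against real records):

```powershell
# rough parsing sketch; offsets are assumptions, check them against your own SMDR output
foreach ($line in (get-content 'C:\script\log\log-smdr.20120407.S1.1.txt')) {
    if ($line -match '^R') {                      # 'R' lines are call records
        $calltype = $line.Substring(4,3).Trim()   # NET or IN
        $station  = $line.Substring(8,5).Trim()   # extension or trunk id
        $start    = $line.Substring(49,5).Trim()  # start time hh:mm
        $elapsed  = $line.Substring(55,8).Trim()  # call duration hh:mm:ss
        "{0} {1} {2} {3}" -f $calltype, $station, $start, $elapsed
    }
}
```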
mellowtigger: (penguin coder)
[image: Good night, Gracie]
I learned yesterday how to write SQL Injection attacks while learning how to code against them during data import. Vandalizing databases is surprisingly easy, as easy as writing email spam that pretends to be from anybody.

The mischief is similar to the old George Burns and Gracie Allen vaudeville act, where information that was intended as just a clause to a command is instead misinterpreted to be the actual command. The SQL Injection attack is constructed so as to cause a SQL database to misinterpret data in exactly the same way.

Here is the import data file that I constructed to cause the problem.

Name Addr

The single quote near the beginning is necessary so as to prematurely close the quotation that was intended to mark the end of a data field. The close parenthesis and semicolon finish out the proper syntax of the intended data import command. Next comes the new command that I want to "inject" as another valid SQL command. The double-dash at the end is also necessary, because it forces the database to ignore as a comment any additional text that comes afterwards, thereby masking the trailing characters from my original import command.
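The original file didn't survive the copy here, but a payload matching that description could look something like this (the table name comes from the code sample in this entry; the rest is illustrative):

```text
name,addr
Robert,x');DROP TABLE reporting.dbo.SampleTable;--
```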

It really is that simple. Even xkcd made fun of the ease with which disastrous consequences can be implemented.

[image: little Bobby Tables]

Here is the Powershell code that I used to take advantage of the SQL Injection attack during data import.
$conn = new-object ('Data.SqlClient.SqlConnection') 'Server=DBSERVERNAME;Integrated Security=True'
$conn.Open()
$cmd = new-object System.Data.SqlClient.SqlCommand
$cmd.Connection = $conn

$input = import-csv g:\directory.path\sql1.injection.csv
foreach($row in $input) {
      $p1 = $row.name
      $p2 = $row.addr
      $sql = "INSERT INTO reporting.dbo.SampleTable (FullName, FullAddr) VALUES ('$p1','$p2');"
      $cmd.CommandText = $sql

      $out = $cmd.ExecuteNonQuery()
}
$conn.Close()
$conn = $null
And here is the code I used to replace the middle section so it is safe from SQL Injection.
$input = import-csv g:\directory.path\sql1.injection.csv
foreach($row in $input) {
      $p1 = $row.name
      $p2 = $row.addr
      $sql = 'INSERT INTO reporting.dbo.MasterID (FullName, FullAddr) VALUES (@p1,@p2);'
      $cmd.commandtext = $sql

      # hash table maps each named parameter to its data value
      $p0 = @{'@p1'=$p1; '@p2'=$p2}
      $cmd.parameters.clear()
      foreach($key in $p0.keys) {
          $cmd.parameters.addwithvalue($key, $p0[$key]) | out-null
      }
      $out = $cmd.ExecuteNonQuery()
}
I used SQL Parameterization to separate the clauses (data fields) of my SQL statement from the actual SQL command itself. I used a powershell hash table to make the dictionary index for those data field values.

I tested it out, and it worked. The first coding sample caused the database to delete the test table. The second coding sample resisted the SQL Injection vandalism, and put the data values into the data fields as intended.


Apr. 17th, 2008 07:49 am
mellowtigger: (Default)
Take note: This may be the only good thing that I ever say about Microsoft products.

I like Windows PowerShell.

I downloaded it last night and played around with it a bit.  Instead of manipulating text strings like other shells, this one manipulates objects.  I suppose the various operating systems are just too different to share a common, cross-platform set of shell commands.  Still, I like it even if it is Windows-specific.
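A quick example of what "manipulating objects" means in practice:

```powershell
# each process arrives through the pipeline as an object with typed properties,
# not as a line of text to be parsed
Get-Process | Where-Object { $_.WorkingSet -gt 100MB } | Select-Object Name, Id
```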


Page generated Sep. 23rd, 2017 01:58 am
Powered by Dreamwidth Studios