mellowtigger: (penguin coder)
2024-03-17 05:05 pm
Entry tags:

Canvas API

I've mentioned before that I can feel my brain starting to think again, thanks to an extended reprieve from the very busy months before now. As further evidence of this improvement, I spent my spare time between tickets today actually learning something!

Many places use Canvas as their educational platform. Both Penn State and the University of Minnesota do, for example. Today, I learned how to use PowerShell to pull information from the Canvas platform. It was surprisingly easy, once I figured out that they provide ready access to the text token you use to interact with their servers. They provide API access to all students, for example.

After I manually generated my token, it was easy to insert "api/v1/" into the url that I would normally use at the webpage for their gui platform, then use my script instead. The Course ID is just the part of the url that shows whenever you access that class through the webpage (not the SIS ID, as I first assumed). Here's a sample of the short code, with a few key text strings redacted for security.

$headers = @{ Authorization='Bearer [redacted]' }
$idCourse1 = '[redacted]'
$uri = "https://[redacted].instructure.com/api/v1/courses/$idCourse1/discussion_topics"
$response = Invoke-RestMethod -Method 'Get' -URI $uri -Headers $headers -ContentType application/json
$response | Select-Object posted_at, title, message | Out-GridView

And that brief code showed me a list of all the Discussions in my course sandbox.
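
One wrinkle worth knowing: Canvas paginates API results (only a handful per page by default), so larger courses need a bigger page size or Link-header walking. Here's a minimal sketch reusing the $headers and $idCourse1 values from above; the assignments endpoint and its due_at/name fields are my assumption from the Canvas API docs, not something I tested in my sandbox.

```powershell
# Sketch only: pull the assignment list the same way, asking for up to 100 per page.
# The hostname stays redacted, as in the snippet above.
$uri = "https://[redacted].instructure.com/api/v1/courses/$idCourse1/assignments?per_page=100"
$assignments = Invoke-RestMethod -Method 'Get' -Uri $uri -Headers $headers -ContentType 'application/json'
$assignments | Select-Object due_at, name | Out-GridView
```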

It's nice to be able to think again.

mellowtigger: (money)
2022-12-27 11:07 am

the pandemic churn at Meetup

Over the years, I've been the online leader for 2 different groups at Meetup.com. It's a great place to organize calendars of activity, and it's free to use for attendees. They charge meeting organizers for the service, but the price is way out of proportion to what it must cost Meetup to provide it. They charged US$150/year for each group, the last I knew a year ago. That's too rich for my blood, so I eventually stopped paying for both of them.

Twin Cities Powershell Automation User Group is looking for a new leader, but it may not get one. It'd be a shame to lose it. I created that group back on 2014 October 14. (I got the date from the earliest copy at the wayback machine.) The powershell group had previously used EventBrite for RSVPs, but it wasn't reaching a large audience. Attendance definitely improved after switching to Meetup. I paid for that group for several years, before handing off its finances to Tim Curwick, the group president at the time. I hope it finds a new leader, but it seems unlikely. This group focuses on in-person meetings (with pizza!) rather than online meetings, which are available everywhere. There are probably more people like me who simply don't attend group events any more due to health risks.

Star Citizens of Minnesota found someone else to take over, which is good. I paid the bills for a few years, which is too much money gone. But I still haven't attended any of their indoor events since the pandemic started. This group is based on the Star Citizen game which has been in development since 2012, and still isn't in release yet, although some functions are playable (if you're on Windows).

I received the notice today that Twin Cities EcoClub is also looking for a new leader. I attended only one of their events. We listened to the person organizing the hydroponics container at the North Market grocery. (Aside: I know I edited photos of that event. Did I really never post it? I certainly can't find it now in Dreamwidth searches.) I remember that one happened after the pandemic started, because we stayed in a large spaced-apart circle outdoors to listen.

I know I've seen notices of other groups losing their leadership. Meetup will eventually delete the groups if nobody else steps up to pay the bills. It's sad to see the groups disappear, but maybe Meetup will finally stop "printing money" and reduce their rates (by a lot) to something more reasonable.

mellowtigger: (penguin coder)
2021-06-16 11:33 pm
Entry tags:

installing powershell and DSC on Linux Mint

I didn't find anything online that completely addressed how to install the latest scripting toys on my Linux Mint system. Microsoft (to their credit) did provide lots of useful material that made it relatively painless. I just wanted to record those details here for posterity's sake, in case somebody else on the internet gives it a whirl later and could use some proverbial bread crumbs.

PowerShell version 7 is built on the new open-source .NET runtime. Yes, Microsoft is using open source for one of their most valuable intellectual properties.
  1. Figure out what version of Ubuntu you're using, which will affect which installer package you need.
    1. more /etc/os-release
      It shows you info about the current distribution, which in my case is Linux Mint 20.1.
    2. more /etc/upstream-release/lsb-release
      It shows you info about the "parent" distribution, which in my case is Ubuntu 20.04.
  2. Based on that Ubuntu version, choose the appropriate instructions on this Microsoft page:
    https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell-core-on-linux?view=powershell-7.1
  3. That's good enough for a basic shell interface, but let's go fancy with the full Visual Studio Code experience. Follow their instructions again:
    https://code.visualstudio.com/docs/setup/linux
  4. Launch "code", then change it to the old-fashioned powershell ISE experience.  I also approve of using text editors like Notepad on Windows, but I feel the urge tonight to modernize.
    1. Call up the Command Palette within Visual Studio Code by pressing CTRL-SHIFT-P.
    2. Click on the listing for "PowerShell: Enable ISE Mode"
  5. In preparation for installing Desired State Configuration on Linux, you will probably need to update some packages, depending on your version of OpenSSL. I didn't know about that requirement, and I had to follow a nested dependency to solve it. To find out your system's version, use this straightforward command:
    1. openssl version
  6. Choose the correct package of Open Management Infrastructure (their open-source implementation of the cross-platform CIM standard) to install, based on that OpenSSL version:
    https://github.com/Microsoft/omi
  7. Choose the correct package of Desired State Configuration to install, based on that OpenSSL version:
    https://github.com/Microsoft/PowerShell-DSC-for-Linux/releases/tag/v1.2.1-0
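
As a quick sanity check after all of that (my own sketch, not part of Microsoft's instructions), the new shell can report its own version and confirm the OMI and DSC bits landed. The install locations in the comments are assumptions from the package defaults:

```powershell
# Run inside the newly installed pwsh shell.
$PSVersionTable.PSVersion          # should report a 7.x version
Test-Path '/opt/omi'               # OMI's default install location (an assumption)
Test-Path '/opt/microsoft/dsc'     # DSC for Linux default location (an assumption)
```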

I'm pretty sure those are all of the tools that the cool kids are using, although DSC seems to go by other names in an Azure environment.  That part still confuses me greatly.  Anyway, someday after I've used it to do something interesting on my computer, I'll be sure to post some code samples.
mellowtigger: (Daria)
2014-10-19 09:13 pm
Entry tags:

Windows 10

For the last few weeks, I've been using the Windows 10 beta on my home pc. The short review:

It's not as awful as Windows 8, so I guess I could use it.

Yes, they skipped Windows 9. The most plausible explanation is that many programs check for earlier versions of Windows with a search string similar to "Windows 9*", which would match instances of Windows 95 and Windows 98. Skipping ahead to Windows 10 avoids the scenario where a program thinks it's running on an old system instead of a new one.
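
The kind of check that supposedly caused the worry would look something like this (an illustrative sketch, not code from any real product):

```powershell
# Hypothetical legacy version check: matching on the product name string.
$osName = (Get-CimInstance Win32_OperatingSystem).Caption
if ($osName -like '*Windows 9*') {
    # intended to catch Windows 95/98, but a "Windows 9" would match too
    Write-Host 'Running on an old 9x-era system'
}
```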

I upgraded my existing Windows 7 to Windows 10, so the process took a long time with a lot of reboots. Afterwards, performance was awful. It was really, really slow. I started disabling services that were hogging disk access. It was automatically defragging, it was monitoring usage to "optimize" disk access... and I turned it all off if I noticed an active service that I didn't immediately need. Finally, performance was back up to Windows 7 speeds.

I started up powershell. I checked, and it's running at version 5 now.  Yay!  I tried out the new command that linux people will recognize from Debian's "apt-get". It allows a 1-line command that will install a software program onto your computer from a centralized host. It makes software management much easier on linux, and now Microsoft has caught up. There weren't any packages available yet that I was interested in using... but the available libraries will grow with time. This feature is very nice!

powershell
Import-Module OneGet
Find-Package

[image: grey slider on grey scrollbar]

The interface is still a little goofy, and I still like Windows 7 better. I can tolerate this new one, at least. The only interface issue that gave me trouble was my difficulty in detecting where the slider is located on a scrollbar. The whole thing is soft grey, and one grey is not sufficiently distinct from another grey. Click the picture on the right to see for yourself. It's a screenshot from my desktop.  I have to look for a moment before my eyes finally detect the slider in the scrollbar.

I decided to provide feedback to Microsoft about the scrollbar. Bad move! Doing so somehow tied my profile (which was previously a standalone account) to my Microsoft online account (which I never wanted). It also activated OneDrive and started syncing files on my pc to Microsoft's servers (which I definitely never wanted). I disabled it immediately:

mmc.exe, then add/remove snapin: Local Computer Policy (or run gpedit.msc)
Computer Configuration\Administrative Templates\Windows Components\OneDrive
edit "Prevent the usage of OneDrive for file storage"
set state to Enabled
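
The same policy can be set without the GUI by writing the policy registry value directly. A sketch from an elevated prompt; the value name is my assumption from the OneDrive policy template, so verify it in gpedit first:

```powershell
# Registry equivalent of "Prevent the usage of OneDrive for file storage".
# DisableFileSyncNGSC is assumed from the policy template; run elevated.
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\OneDrive'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name 'DisableFileSyncNGSC' -Value 1 -Type DWord
```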

I also took back control of my profile by disconnecting the Microsoft online account:

PC Settings / Users and accounts / Your profile
Disconnect

I've decided that Windows 10 is passable. There's no compelling reason to upgrade to Windows 10... except the eventual shutdown of support for Windows 7 by Microsoft. Welcome to the future. Hurrah?
mellowtigger: (penguin coder)
2013-02-18 09:04 pm
Entry tags:

resurrecting Clippy

None of those annoying animations were ever supposed to exist on Windows 7. There was such an outcry, though, that Microsoft created a hotfix that allowed support of their deprecated animation protocol known as Microsoft Agents. Doing a little more digging, I found sources for all of their original agents.

Read the comments prior to each section to find the urls you need to visit to download all of the relevant files. Each of them should produce a file of .ACS extension in the C:\WINDOWS\MSAGENT\CHARS\ directory.  This Powershell script will demonstrate all of them.



# Demonstration of Microsoft Agent controls.
# NOTE: These agents are deprecated.  They were never supposed to be on Win7.
# http://www.microsoft.com/products/msagent/main.aspx

# The primary Agent control can be downloaded for Windows 7 as a hotfix.
# It will install c:\windows\msagent\chars\merlin.acs, plus other supporting files.
# Download the hotfix from this url:
# http://support.microsoft.com/kb/969168

$MI6 = new-object -com agent.control.2
$MI6.connected = $true

$MI6.characters.load('Merlin','Merlin.acs')
$agent = $MI6.characters.character('Merlin')
$agent.Show()
$agent.think('Hello!  I''m over here!')
$agent.play('Wave')
$agent.moveto(800,600)
$agent.think('I and the other Microsoft Agents are here.')
$agent.Play('Pleased')
$agent.think('I am Merlin.')
$agent.think('Only 1 agent can be loaded at a time.')
$agent.hide()
while ($agent.visible) { sleep -Milliseconds 100 }
$MI6.characters.unload('Merlin')

# You can download the other official agents at these urls.
# Genie  @ http://www.microsoft.com/en-us/download/details.aspx?id=4957
# Robby  @ http://www.microsoft.com/en-us/download/details.aspx?id=12327
# Peedy  @ http://www.microsoft.com/en-us/download/details.aspx?id=17076
# Clippy @ http://www.microsoft.com/en-us/download/details.aspx?id=11215
# list of their animations @ http://msdn.microsoft.com/en-us/library/ms695821.aspx

$agentlist = @(('Genie','Genie.acs'),('Robby','Robby.acs'),('Peedy','Peedy.acs'),('Clippy','Clippit.acs'))
foreach($name in $agentlist.GetEnumerator()) {
    $MI6.Characters.load($name[0], $name[1])
    $agent = $MI6.characters.character($name[0])
    $agent.moveto(800,600)
    $agent.show()
    $agent.think('Hi, I''m '+$name[0]+'.')
    $agent.play('Wave')
    $agent.hide()
    while ($agent.visible) { sleep -Milliseconds 100 }
    $MI6.characters.unload($name[0])
}

# A Microsoft Office agent no longer downloadable from Microsoft (that I could find).
# Not all animations from above are available on this character.
# Rover @ http://www.tinmith.net/wayne/rover/index.htm

$MI6.Characters.load('Rover','rover.acs')
$agent = $MI6.characters.character('Rover')
$agent.moveto(800,600)
$agent.show()
$agent.think('Hi, I''m Rover.')
sleep -seconds 6
$agent.hide()
$MI6.characters.unload('Rover')

# Non-Microsoft agents that people have developed:
# http://www.msagentring.org/chars.aspx
# http://www.zero2000.com/microsoft-agent-characters.html



I call this script Clippy.ps1.  I loathed Clippy passionately.  Now, finally, he's my puppet.  Come to the dark side; feel the power(shell)!
mellowtigger: (penguin coder)
2013-02-17 01:43 am
Entry tags:

Firefox bookmarks into filesystem shortcuts

I wrote a Powershell script to convert my Firefox browser bookmarks into shortcut files on my hard drive. I was getting too large a collection, and saving new bookmarks was finally taking obviously more-than-zero time to process. From now on, I'll just drag shortcuts into folders for easier sorting and searching.

Powershell V3 includes new JSON commands, but they didn't work because my data store was too large. I had to use a 3-line command instead of a 1-line command to do the same thing, so it was an easy workaround. I retained the same directory structure and creation timestamp as in my bookmarks, too. I didn't bother with webpage icons, but the script could be upgraded to handle those too.



# Convert bookmarks from Firefox browser JSON backup to Windows 7 filesystem
# This script works in PowerShell v3.

# configure these variables as needed
$sourcefile = 'p:\www\bookmarks-2013-02-16.json'
$destfolder = 'p:\www\bookmarks\'
$foldertype = 'text/x-moz-place-container' # specific to Mozilla Firefox browser

# unix epoch time is milliseconds since 1970-01-01 00:00:00 GMT
# http://www.epochconverter.com/
$epoch  = [datetime]'1970-01-01'

# i found that a delay was necessary, otherwise errors cropped up occasionally
$delay = 10 # milliseconds

function CheckNodeType([object]$node, [string]$parent) {
    if ($node.type -eq $foldertype) {
        ProcessDirectory $node $parent
    } else {
        ProcessBookmark $node $parent
    }
}

function ProcessDirectory([object]$node, [string]$parent) {
    # remove bad characters from directory names
    $fixname = [system.text.regularexpressions.regex]::replace($node.title,"[^0-9a-zA-Z._]","")
    $dir = $parent + $fixname + '\'
    if (-not (test-path $dir)) {
        new-item -name $fixname -path $parent -itemtype 'directory'
        $offset = $node.dateAdded / 1000000
        $time   = [timezone]::CurrentTimeZone.ToLocalTime($epoch.AddSeconds($offset))
        sleep -Milliseconds $delay
        Set-ItemProperty -path $dir -name CreationTime -value $time
    }
    foreach($subnode in $node.children) {
        CheckNodeType $subnode $dir
    }
}

function ProcessBookmark([object]$node, [string]$parent) {
    # remove bad characters from filenames
    $fixname = [system.text.regularexpressions.regex]::replace($node.title,"[^0-9a-zA-Z. _]","")
    # truncate excessive filename lengths
    if ($fixname.Length -gt 200) {
        $fixname = $fixname.substring(0,200)
    }
    # process the good filename
    $file = $parent + $fixname + '.url'
    if (-not (test-path $file)) {
        $data = "[InternetShortcut]`nURL="+$node.uri
        new-item -name ($fixname + '.url') -path $parent -itemtype 'file' -value $data
        $offset = $node.dateAdded / 1000000
        $time   = [timezone]::CurrentTimeZone.ToLocalTime($epoch.AddSeconds($offset))
        sleep -Milliseconds $delay
        Set-ItemProperty -path $file -name CreationTime -value $time
        Set-ItemProperty -path $file -name LastWriteTime -value $time
    }
}

# read the JSON-formatted text string from the bookmark backup file
$json  = get-content $sourcefile

# the native ConvertFrom-Json command does not work on very large JSON stores, so use this method instead
[Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | out-null
$jsser = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$jsser.MaxJsonLength = $json.Length
$tree  = $jsser.DeserializeObject($json)

# process the object tree
$menu  = $tree.children | where { $_.title -eq 'Bookmarks Menu' }
foreach($node in $menu.children) {

    # ignore management folders
    if (($node.title -ne 'Recently Bookmarked') -and 
        ($node.title -ne 'Recent Tags') -and 
        ($node.title -ne 'JSTOR liberator')) {

        CheckNodeType $node $destfolder
    }
}


It took several more hours than I expected to get that script working, but it works well.  I like my bookmark folders.  I expect to do some manual work to create a better sorting nomenclature for them.
mellowtigger: (penguin coder)
2012-11-15 12:07 pm
Entry tags:

screen scraping with Internet Explorer

Powershell can be used with Internet Explorer to "screen scrape" websites. Launching these scripts through the task scheduler can be complicated to set up, especially if the destination url is a secure protocol website.  I wanted to avoid making the batch logon a local machine administrator, so I found the following security changes necessary to get the script to work properly.

1) Update the logon authority

1. The end user account should not be an administrator account, just a regular domain (or local) user.
2. Run the local security policy editor (c:\windows\system32\secpol.msc)
3. Navigate to Security Settings / Local Policies / Security Options
4. Click "Log on as a batch job", add the userid or security group
Preferably, use a domain security group designed specifically for this purpose.

2) Update the COM policy

1. Run the DCOM policy editor (c:\windows\system32\dcomcnfg.exe)
2. Navigate to Console Root / Component Services / Computers / My Computer
3. Right-click on My Computer, select Properties
4. Navigate to COM Security tab.
5. Click "Edit Default" for Launch and Activation Permissions
6. Add the userid or group and Allow both "Local Launch" and "Local Activation".

3) Disable user IE enhanced security

1. Run the server manager (Start / Computer / right-click, Manage)
2. Click to select the top entry "Server Manager (servername)"
3. Click link on right panel for "Configure IE ESC"
4. toggle OFF for Users

4) Update the IE settings

1. Run the Internet Options editor (Control Panel / Network and Internet / Internet Options)
2. Navigate to Security tab
3. Make sure you are viewing the "Internet" zone
4. UNcheck (if using IE8 or prior) "Enable Protected Mode".
5. View the "Trusted Sites" zone
6. UNcheck "Enable Protected Mode"
7. click "Sites" button
8. add your destination domain to the list

If you fail to perform these steps, then the powershell variable used to interact with IE will "break".  Either IE will fail to launch (possibly due to the choice of default webpage to load on initialization), or powershell will be unable to interact with the browser that launches.  OS security features will "break" the handle to the Document Object Model variable in IE, leaving empty values instead.
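
With those settings in place, the scraping itself is straightforward COM automation. A minimal sketch (example.com stands in for the real destination url):

```powershell
# Minimal IE screen-scrape skeleton via the InternetExplorer.Application COM object.
$ie = New-Object -ComObject 'InternetExplorer.Application'
$ie.Visible = $false
$ie.Navigate('https://example.com/')
# wait for the page to finish loading (ReadyState 4 = complete)
while ($ie.Busy -or ($ie.ReadyState -ne 4)) { Start-Sleep -Milliseconds 250 }
# if the security settings above are wrong, $ie.Document comes back empty here
$html = $ie.Document.body.innerHTML
$ie.Quit()
```
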
mellowtigger: (penguin coder)
2012-10-15 09:12 pm
Entry tags:

is this script interactive?

Can a powershell script determine if it's running in an interactive environment or not? I have a few methods that get close to the answer, but nothing offers satisfying certainty yet.

Code:
    $host.name
Explanation: powershell.exe yields 'ConsoleHost', while powershell_ise.exe yields 'Windows PowerShell ISE Host'. Each program, however, can operate in either batch or interactive conditions.

Code:
    $process = gwmi win32_process -filter "ProcessID=$pid"
    $batch = $process.commandline -match "-NonInteractive"
Explanation: If someone programs the scheduled task with the "-NonInteractive" parameter, then the script can successfully read that parameter.  It's not the default, though.

Code:
    $host.runspace.runspacestateinfo
    $host.runspace.runspaceavailability
Explanation: I always find the state info as 'Opened'. In gui conditions, availability has always been 'Busy'. In batch conditions, though, it is 'Available'.  Can I rely on this indicator?

I need to know, because I want to separate hardcoded userid/password combinations from my scripts.  I want to put them into the parameter of the scheduled task instead.  When testing those same scripts interactively, though, I want the script to prompt me for the data that it didn't receive through parameters.

Any script stuck at a user prompt is a Bad Thing [tm] when running in batch mode, since no one will ever answer it.  Using the '-NonInteractive' parameter is a good idea for all batch PowerShell scripts for this reason.  It seems to just suppress any command (or the partial execution of it) that would require user input.  I didn't even know this parameter existed until today.  It would have prevented an error in a production script that got stuck prompting for permission to delete a subfolder that I never expected to be there.  My script would have skipped that subfolder and gone on its merry way doing other things.

It still doesn't answer my question, though.  Is there a reliable way to tell if my script is running interactively or not?  Yes, it's a bit like asking which kind of virtual reality I live in, but sometimes it is important to know the distinction.  :)  Does anyone have the answer?

edit 2012.10.19: I forgot to mention previously, but someone on g+ pointed me to a webpage where contributors mentioned a few ways of dealing with this problem.  Among the solutions was one that works:  [environment]::userinteractive.
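
Putting that answer together with my userid/password scenario from above, a sketch of how I could use it:

```powershell
# Prompt for missing data only when running interactively; fail fast in batch mode.
param ([string]$userid, [string]$password)

if (-not $userid) {
    if ([environment]::userinteractive) {
        $userid = Read-Host 'userid'
    } else {
        throw 'userid parameter is required in batch mode'
    }
}
```
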
mellowtigger: (penguin coder)
2012-05-20 11:23 am
Entry tags:

recordsplit.ps1

Someone on Google Plus asked for a quick program. I couldn't format the text properly on G+, so I'm posting it here instead.

# Save this text using Notepad to a file named recordsplit.ps1
# Right-click the new recordsplit.ps1 file, then select "Run with PowerShell"
# If it doesn't run, you need to enable powershell scripts...
#      http://technet.microsoft.com/en-us/library/ee176949.aspx#EEAA

# This script separates a text file into other text files, based on the delimiter text.
# Change the first 3 variables as needed.
# If more than 999 records will be used, be sure to change the "D3" in the $newfile to an appropriate digit length ("D4")

$file='P:\ps\CraigFroehle.txt'
$delimiter='{{NEW_RECORD}}*'
$keepdelimiter=$true

$counter = 0
$fileitem = gci $file
foreach ($line in (get-content $file)) {
    if ($line -like $delimiter) {
        $counter += 1
        $newfile = $fileitem.directoryname + '\' + $fileitem.basename + '.' + ("{0:D3}" -f $counter) + $fileitem.extension
        set-content -path $newfile -value $null
    }
    if (($line -ne $null) -and ($line -ne '')) {
        if (($keepdelimiter -eq $true) -or ($line -notlike $delimiter)) {
            add-content -path $newfile -value $line
        }
    }
}

Enjoy.  :)
mellowtigger: (penguin coder)
2012-04-09 10:59 am
Entry tags:

SMDR phone logging

[image: phone operator]

The more I learn about phone system technology, the more I wonder how anything ever works (successfully).

Among the hodge-podge of responsibilities I have at my job, I also manage 6 PBX phone systems of the Mitel / Intertel CS-5000 family. Thanks to a useful hint that my boss found in a Perl script via Google, I was able to establish phone system logging using Powershell. My programming will save us over $10,000 that we would have been charged by our vendor. That's half my annual salary, so I've earned my keep for a while. :)

I've redacted some of the details from my code for security reasons, but I wanted to make the source code public so it's easier for the next person to find useful material through a web search engine.

# log-smdr.ps1
# This Powershell script connects to a Mitel/Intertel CS-5000 series phone system, 
# directs it to open its SMDR stream, then logs the results to a text file.
# Station Message Detail Recording (SMDR) is a system feature that provides 
# a detailed record of outgoing and incoming calls.
# This script is intended to be restarted at midnight each night.

# Turning on SMDR reporting is a two-step process in DbStudio Session Manager.
# 1) System / Maintenance / SMDR, set "SMDR Output Active" to yes
#    ("Output Port" should already be set to "NONE")
# 2) System / Sockets / SMDR, set "Enable" to yes
#    (System / Cabinet / Sockets / SMDR, in some older systems)

# http://technologyordie.com/mitel-5000-smdr-script (example perl script)
# http://www.codeproject.com/Articles/12893/TCP-IP-Chat-Application-Using-C

param ([string]$site='needparm', [string]$logdir='C:\script\log\', [boolean]$debug=$false)

# convert $site parameter into ip/port combination
switch ($site) {
    'S1' { $ip='192.168.0.1'; $port=4000 }
    'S2' { $ip='192.168.0.1'; $port=4000 }
    'S3' { $ip='192.168.0.1'; $port=4000 }
    default { $ip='192.168.0.1'; $port=4000 }
}

# find a new filename for logging this session
$scriptps1 = $MyInvocation.MyCommand.Name
$scriptname = $scriptps1.substring(0,($scriptps1.length - 4))
$datelog = get-date -f 'yyyyMMdd'
$version = 0
do {
    $version += 1
    $filelog = $logdir + $scriptname + '.' + $datelog + '.' + $site + '.' + $version + '.txt'
} while (test-path $filelog)
set-content $filelog $null
if ($debug) {write-host 'logfile:' $filelog}

# create a tcp socket and a destination
# fyi, the TcpClient object did not work with SMDR (perhaps some default values interfered?)
# so I took it down a layer to the Socket object which works nicely
$sockfam  = [System.Net.Sockets.AddressFamily]::InterNetwork
$socktype = [System.Net.Sockets.SocketType]::Stream
$sockpro  = [System.Net.Sockets.ProtocolType]::TCP
$socket   = New-Object System.Net.Sockets.Socket $sockfam, $socktype, $sockpro
$ipaddr   = [System.Net.IpAddress]::Parse($ip)
$ipdest    = New-Object System.Net.IPEndPoint $ipaddr, $port

# establish the connection
if ($debug) {write-host 'connecting:' $ip ':' $port}
$socket.Connect($ipdest)
if ($socket.Connected -eq $false) {
    if ($debug) {write-host 'connection: FAILED'}
    return
}
if ($debug) {write-host 'connection: OPEN'}

# send the SMDR command ASCII character string (no password)
#     2  	002	02	00000010	STX	Start of Text
#     0  	000	00	00000000	NUL	Null char
#     132	204	84	10000100	„	Double low-9 quotation mark
$cmd = @([byte]2,[byte]0,[byte]0,[byte]0,[byte]132,[byte]0)
$socket.Send($cmd)
if ($debug) {write-host 'connection: CMD SENT'}

# the socket's .Receive() will Block, causing the program to freeze until bytes are received,
# so set up a stream which provides the useful .DataAvailable property instead
# incoming records are always 86 bytes long
$bytes = new-object System.Byte[] 86
$stream = new-object System.Net.Sockets.NetworkStream($socket)

do {
    start-sleep -m 1000
    if ($stream.DataAvailable) {
        $socket.Receive($bytes) | out-null  # script hangs here until bytes are received
        $result = [System.Text.Encoding]::ASCII.GetString($bytes[0..84]) # ignore trailing CR/LF
        if ($debug) {write-host $result}
        add-content $filelog $result # append data to text log file
    }
    $datetest = get-date -f 'yyyyMMdd'
} while ($datetest -eq $datelog)

if ($debug) {write-host 'connection: CLOSING'}
$socket.close()
if ($debug) {write-host 'connection: CLOSED'}
$socket = $null
You turn on SMDR by using the DbStudio Session Manager software that comes with your PBX system. That application directory also contains \DiagMon\Diagmon.exe, a program that can also be used to monitor (but not log, as far as I can tell) SMDR output from your phone panels. I intend to use DiagMon with packet sniffing software to reverse engineer the protocol for handling password-enabled SMDR. It's really hard to come by programmer-focused PBX documentation. I even contacted Mitel, but they defer to local vendors who have a financial disincentive to provide such documentation. :(

Here's a redacted version of the debug output from this script:
connecting to 192.168.0.1 : 4000
connection: OPEN
6
connection: CMD SENT
Q   Station Message Detailed Recording                          17:01:24 04-07-2012
Q    TYP EXT#  TRUNK DIALED DIGITS                START ELAPSED   COST  ACCOUNT CODE
Q
R   NET 1234  P0001 94567                        17:11 00:03:34 $00.00
R   NET 1234  P0001 95678                        17:16 00:00:09 $00.00
R   NET 2345  P0001 96789                        17:21 00:00:14 $00.00
R   IN  3456  90001                              17:19 00:04:56 $00.00
My next project will take these .TXT log files, parse them, and store the data fields in a SQL Server table. Then I can do lots of fun stuff like calculate the average length of time that people are spending in our phone tree before they reach a live person, or determine if people are hanging up in frustration before reaching any person.
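
Parsing those fixed-width records could start with something like this sketch. The column offsets below are my own guesses from the redacted sample output above, not anything from Mitel documentation, so verify them against real logs first:

```powershell
# Hypothetical parse of the fixed-width "R" call records from the log file.
# Offsets estimated from the sample output; adjust after checking real data.
Get-Content $filelog | Where-Object { ($_.Length -ge 63) -and ($_ -like 'R*') } | ForEach-Object {
    New-Object psobject -Property @{
        Type    = $_.Substring(4,3)          # NET / IN / etc.
        Ext     = $_.Substring(8,5).Trim()   # extension number
        Start   = $_.Substring(49,5)         # call start time
        Elapsed = $_.Substring(55,8)         # call duration
    }
}
```
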
mellowtigger: (penguin coder)
2012-03-09 10:28 am
Entry tags:

literal interpretations

[image: Good night, Gracie]

I learned yesterday how to write SQL Injection attacks while learning how to code against them during data import. Vandalizing databases is surprisingly easy, as easy as writing email spam that pretends to be from anybody.

The mischief is similar to the old George Burns and Gracie Allen vaudeville act, where information that was intended as just a clause to a command is instead misinterpreted to be the actual command. The SQL Injection attack is constructed so as to cause a SQL database to misinterpret data in exactly the same way.

Here is the import data file that I constructed to cause the problem.

Name,Addr
Trust Me,.'); DROP TABLE TESTTABLE1; --

The single quote near the beginning is necessary so as to prematurely close the quotation that was intended to mark the end of a data field. The close parenthesis and semicolon finish out the proper syntax of the intended data import command. Next comes the new command that I want to "inject" as another valid SQL command. The double-dash at the end is also necessary, because it forces the database to ignore as a comment any additional text that comes afterwards, thereby masking the trailing characters from my original import command.

It really is that simple. Even xkcd made fun of the ease with which disastrous consequences can be implemented.

[image: xkcd's "little Bobby Tables" comic]

Here is the Powershell code that I used to take advantage of the SQL Injection attack during data import.
# SETUP
$conn = new-object ('Data.SqlClient.SqlConnection') 'Server=DBSERVERNAME;Integrated Security=True'
$conn.open()
$cmd = new-object System.Data.SqlClient.SqlCommand
$cmd.CommandTimeout=30
$cmd.Connection = $conn

# SQL INJECTION
$input = import-csv g:\directory.path\sql1.injection.csv
foreach($row in $input) {
      $p1 = $row.name
      $p2 = $row.addr
      $sql = "INSERT INTO reporting.dbo.SampleTable (FullName, FullAddr) VALUES ('$p1','$p2');"
      $cmd.CommandText = $sql

      $out = $cmd.ExecuteNonQuery()
}

# CLEAN UP
$conn.close()
$conn = $null
And here is the code I used to replace the middle section so it is safe from SQL Injection.
# SAFE DATA IMPORT
$input = import-csv g:\directory.path\sql1.injection.csv
foreach($row in $input) {
      $p1 = $row.name
      $p2 = $row.addr
      $sql = 'INSERT INTO reporting.dbo.MasterID (FullName, FullAddr) VALUES (@p1,@p2);'
      $cmd.commandtext = $sql

      $p0=@{p1=$p1;p2=$p2}
      $cmd.parameters.clear() # remove the parameters added during the previous row
      foreach($key in $p0.keys) {
            $cmd.parameters.addwithvalue("@$key",$p0[$key])
      }
      $out = $cmd.ExecuteNonQuery()
}
I used SQL Parameterization to separate the clauses (data fields) of my SQL statement from the actual SQL command itself. I used a powershell hash table as the dictionary of those data field values.

I tested it out, and it worked. The first coding sample caused the database to delete the test table. The second coding sample resisted the SQL Injection vandalism, and put the data values into the data fields as intended.
mellowtigger: (Default)
2008-04-17 07:49 am
Entry tags:

powershell

Take note: This may be the only good thing that I ever say about Microsoft products.

I like Windows PowerShell.

I downloaded it last night and played around with it a bit.  Instead of manipulating text strings like other shells, this one manipulates objects.  I suppose that various operating systems are just too different to come up with a common command set to make a cross-platform list of shell commands.  Still, I like it even if it is Windows-specific.
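
The difference is easy to see in a single pipeline. Each stage below receives real process objects with typed properties, not text that has to be re-parsed (a generic illustration I put together, not anything from Microsoft's documentation):

```powershell
# The pipeline passes System.Diagnostics.Process objects, so properties
# like WorkingSet64 can be filtered and sorted without any text parsing.
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 50MB } |
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, Id, CPU
```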