Wednesday, August 25, 2021
Whole Foods delivery persons scan people's driver's licenses into their mobile phone app, a wildly risky procedure
In a grocery delivery order yesterday from Whole Foods, I requested a bottle each of red and white cooking wine. The delivery person rang my doorbell and requested ID because the order included "alcohol", though she said, "You are obviously over 18". I brought her my driver's license and handed it to her, and then noticed she was preparing to photograph it with her cell phone.
I snatched the driver's license back and attempted to explain to her that I did not consider it safe to allow her to take a photograph of my driver's license, which includes every piece of information needed for identity theft short of an actual social security number. Her broken English was not sufficient for us to communicate. She put me on a call via her phone with "customer support", a man whose Spanish-accented English was even worse than hers. This man insisted that I had to allow my ID to be photographed. "Just do it", he said. "It's required." After attempting to explain and getting the same rote response, I hung up on him.
After several minutes, the poor delivery person finally gave up and left. I looked through my order email and searched online, trying to find a customer support number for Whole Foods or even just for Amazon. The only number I could find connected me directly to the recent delivery person. In desperation, I phoned my local Whole Foods store and demanded to speak to the manager. After several minutes' wait, I spoke with a man who claimed to be the store manager, but he said that his store is one of those Whole Foods stores that has no voice in how Whole Foods deliveries are handled. However, he listened to my complaint and then gave me an actual phone number for support on Whole Foods delivery: 877-477-3527.
Upon calling this support number, I reached a help-line person whose English was understandable. She first sent a security code to my mobile phone so that I could verify my identity--a sign of Amazon's usually reasonable security procedures. Then I explained the entire situation, naming my delivery person and requesting that she not receive any kind of discipline over my refusal to allow my license to be scanned. I expressed my disappointment that Amazon was taking the risk of scanning customer IDs in a phone app on its delivery people's many different mobile phones. I told her I understand that legally they need to verify that the order came from someone who is not a minor, but as in a liquor store, showing the delivery person an appropriate form of ID should be enough. The delivery person ought to be able simply to note down which kind of ID was shown. After all, liquor stores don't scan our driver's licenses.
Next, she smugly assured me that scanned ID photos "are not stored locally on the device". My response was, "How do you know that? How can you even begin to promise that? I won't allow my driver's license to be photographed by a stranger onto their phone. It makes me nervous enough to use a banking app for check deposits, but there, at least, I have personally authenticated myself to the bank over an encrypted connection." I continued to explain that we customers know absolutely nothing about the app used by Amazon drivers, and we have no reason whatsoever to assume it doesn't store those photos locally, at least temporarily. Apps have frequently been caught lying about data collection and stealing and misusing customer information, as anyone who reads the technical press knows. This practice, I argued, also violates the basic tenet of data collection: collect no more personal information about people than is absolutely necessary. I went on and on. How many episodes have occurred, for example, where an app illegally accessed the camera or microphone of a device? News stories about such breaches of privacy appear almost daily. And given the slow pace of cell phone uploads to the network, I feel certain that the photos must at least pass temporarily through the device's local storage, where they risk not being deleted properly later.
In the end, she promised to record and escalate my complaint to her management. Whether she actually will do so or not, I do not know. I think it more likely that I'll be noted down as a difficult or problem customer. Hence, this blog. People should refuse to allow strangers to photograph sensitive materials on their cell phones. This is basic common sense in today's "bad internet" environment.
Saturday, December 19, 2020
An episode with the dreaded "Windows doesn't recognize the headphones" problem
Just over a year ago, I bought a Dell 7400 i7 laptop at my university and paid over $1800 for it, which included a two-year technical support contract. This is the most money I ever paid for a computer in my life. And as often happens, after about a year of use, the laptop developed a problem, in that the machine stopped recognizing the headphones. The audio still worked fine--out loud--but in 2020, this year of the pandemic when I'm living as a shut-in with a spouse trying to work in the next room, having no headphones became a real hardship.
I tried the usual things. I Googled, I looked in Dell tech support forums, and I pursued various possible "cures" carefully and diligently. This was made worse because I was using three monitors and a hub to interface with a scanner, an external drive, an external fan, an external keyboard, etc. One of the monitors also had an audio jack, and that jack actually worked--but it was located on the back of the monitor, where it was impossible to reach to plug anything in or out. Nor was this remote monitor jack good enough to satisfy web-based music collaboration software such as JamKazam, which I had been hoping to use throughout the pandemic but could not, due to this stupid headphones-not-being-detected problem.
Weeks went by, then months, and all my efforts failed. Meanwhile, I had removed everything docked to the laptop, and all external audio drivers, in case they could have been the problem. And finally I was pissed off and desperate enough to telephone Dell Tech Support. Having purchased Dell laptops for over 20 years, I know what I'm talking about when I say having to telephone tech support is a crap shoot. Dell has always tried hard, but it is a large company, and its help lines are outsourced to overseas firms with a frankly mixed record. And Dell has, or used to have, multiple tech support systems, one for corporate customers, another for academic customers (me, this time), and yet another for home consumers.
Dell support seems to have improved, though. I had bought this machine through a university discount program, which gets even better tech support from Dell than normal people get, and the machine was still under warranty. This time, I was immediately routed to the correct part of the support chain. There was a long wait on hold, then I got to speak to a person in the Philippines, who had really good English and furthermore was smart and empowered to get to the bottom of an issue--which was not always the case in past years.
So I made my report, and assured tech support that yes, I had tried multiple times to uninstall the audio devices in software, rebooted to let them reinstall themselves, and then rebooted twice more to install the very latest drivers from Dell. Yes, nothing was disabled. Yes, I had tried multiple headphones, and I knew they worked elsewhere. After these long discussions, they eventually agreed with me that possibly the physical audio jack had simply failed, because audio jacks are dumb physical devices with little wire parts in them that can get bent from stress. And--guess what? The audio jack was not modular but was fused onto the motherboard. Thus, the entire motherboard would need to be replaced. And I was warned that if that did not do the trick, the next step would be to wipe the Windows operating system and start over, because then the problem would definitely be proved to be software.
I really, really did not want to start my OS over; that would cause me to lose literally weeks of effort getting everything I need for my work reinstalled. And I had already been thoroughly inconvenienced by this problem.
Meantime, Dell tech support got a little confused, and its left hand didn't know what its right hand was doing. One Dell manager emailed me that the motherboard would not be available for nearly two months, and would I accept a replacement system instead? I responded yes, if that was the best they could do, I'd take the replacement rather than wait. But on the same day, I started getting texts and calls from another support stream indicating that a repair technician wanted to come to my house the next day--which was looking to be during a major snowstorm--and would I please indicate whether I'd be home and confirm that no one in the household was sick. I answered yes, I would be home, and no, no one was sick.
Two days later, as it happened (there was a snowstorm after all), a Dell technician did indeed show up at the house, after phoning to make an appointment. It was not a convenient time for me (I had to miss yoga class), but I wasn't going to delay the repair for any reason. I locked my spouse and cat in a bedroom, met the tech at the door wearing a mask, and he came masked into the house and labored busily over the laptop for about 30 minutes, after which the headphones still did not work. Nevertheless, he advised me to call tech support again "and try some things."
I felt depressed. Would I need to wipe Windows after all, and would even that fix the problem? But after a few hours of moping, I rallied and tried all the usual shenanigans once again--removed the audio devices in the control panel, rebooted, let them reinstall themselves. Still didn't work. Went to Dell's support site, downloaded the latest drivers and installed those, requiring two more reboots (one to remove old drivers, one to install new ones). And--it worked! My headphones are now detected when I plug them in--for the first time in maybe 6 months! There is a god.
And it makes me wonder why in the world something as dumb as an audio jack would not be designed to be modular. It should be plug and play, just like a vacuum cleaner part that customers could change out themselves. But no, the entire bloody motherboard had to be changed, involving a house call by a technician. Any other year, Dell would probably have made me ship the computer in and get it back three days later via overnight shipping, but in this 2020 pandemic year, shipping is so buggered that it would probably have taken three weeks instead of days. So thank you, Dell, for having decent tech support, even if it got a little confused and two different people ended up emailing me about two entirely different repair strategies.
And thank the universe that I didn't have to wipe the operating system, and that after spending only maybe 40 or 50 hours total on the problem (on my part, including the searches and failed troubleshooting attempts before calling tech support), it does appear to be solved.
Bottom line is, based on all my experience, it is typical for a laptop to have something fail, and that can happen anytime within the first year. So it is probably worth it to purchase a tech support contract. For this reason, I sometimes advise friends to shop at a local Best Buy so that, when that failure does occur, they can take the laptop in to the Best Buy Geek Squad for repair. Dell is also, I think, still a decent vendor if you buy a support contract along with the laptop.
Wednesday, August 14, 2013
Dropbox and SendSpace: useful and reliable file services
Dropbox will store your files "in the cloud" (that is, on a server, or servers, somewhere on the internet). You access the files via a special "Dropbox" folder on your computer, which is created by the Dropbox installer. Dropbox will automatically copy any files you deposit in its folder up to its servers. Then you'll be able to reach those same files on any other computer where you've installed Dropbox using the same logon account. To get started on a given computer, you just download and install the Dropbox client, run it, and log on to your Dropbox account in the client. The client program manages your Dropbox folder. It is very smart about synchronizing local and server files, and it gives a good visual indication of when the synchronizing is done.
Sendspace will allow you to transfer very large files. I generally zip, or compress, one or more files into a single huge file before uploading it to the Sendspace server. Sendspace is drop-dead simple to use. You don't need any coaching from me--just go to the site and follow the instructions. When a file is too huge to move around by any other means, Sendspace can nearly always move it between any two computers, as long as both have access to the internet. Sendspace's free service offers the sender a link to delete the file from its servers, but if the sender doesn't bother, the file will be deleted automatically after two weeks.
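As an aside, the zip step itself is nearly a one-liner in C# these days. A minimal sketch, assuming .NET 4.5 or later and a reference to System.IO.Compression.FileSystem (the paths here are made up):

using System.IO.Compression;

class ZipBeforeUpload
{
    static void Main()
    {
        // bundle an entire folder into a single archive, ready for upload
        ZipFile.CreateFromDirectory(@"C:\outbox\bigproject", @"C:\outbox\bigproject.zip");
    }
}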
Paying for Sendspace will increase the maximum allowed file size from huge to humungous. It will also allow you to keep files up on their servers indefinitely (as long as your account is paid up).
To send a file to someone else, you do have to give Sendspace their email address; Sendspace then emails the person a link which they can use to download the file. As far as I can tell, Sendspace does not mine the emails and never spams any user or recipient of files. That the service has remained spam-free is unusual these days, and that makes it one of my favorite internet companies right now, and one of the few I can recommend whole-heartedly.
Good services deserve good publicity; hence this blog entry. Give them a try! It won't cost you anything to try them out. However, I don't recommend that you send any sensitive data using these (or any other "cloud-based") services. There is never any guarantee that a third party company won't look inside your data. Thankfully, though, the kinds of files I'm sending around are not likely to be desirable to anyone but me or my immediate co-workers, and they don't contain anyone's private information.
Friday, July 05, 2013
GoDaddy hassle
[Screenshot: example page with the SSIs breaking.]
[Screenshot: example of a healthy page.]
I immediately put in a call to GoDaddy technical support to find out what had happened and get some help figuring out how to get the site back to its former state of health. After navigating numerous computerized phone menus and waiting on hold for about 15 minutes, I finally reached a human being, who immediately put me on hold and then disconnected the call. This person did not call me back, so after a few more minutes, I put in a second call to GoDaddy support. Same drill: after about 15 minutes, I got a person, who didn't know anything and put me on hold while he "contacted the server group". After another 15 minutes, he returned to announce that I would have to fix the problem myself, as it was a scripting problem. OK, I enquired, how shall I fix the problem? My code hasn't changed. And in the meantime, I had verified by extensive web searching that GoDaddy's forums had no help page showing how server-side includes ought to work. Further, there were many entries in GoDaddy's forums within the past two weeks by other customers whose server-side includes had also stopped working. "Sorry", the tech support guy said, "it's a scripting problem and we don't touch your code. You'll have to fix it."
I was now wavering between disbelief and rage. After spending another hour trying every wild suggestion, and everything I've ever encountered for getting server-side includes working on Linux, I "patched" the problem for the short term by eliminating the SSIs altogether on the important pages of the website, so that my site once again had graphics and styling.
Returning the next day, fresh and rested, I was able to get server-side includes working again by making this small change:
Bad directive: <!--#include virtual="./shared/insert.txt" -->
Good directive: <!--#include file="./shared/insert.txt" -->
Really? A one-word change fixed this? (The difference, as best I can tell: "virtual" treats the path as a URL and runs it back through the web server's request handling, while "file" reads it directly from the file system relative to the current document--so some server-side change must have altered how virtual paths were resolved.) And GoDaddy tech support is too stupid to tell this to customers? Needless to say, I don't trust GoDaddy. I fully expect the pages to stop working again at any moment due to some unknown change in their server configuration.
And, I will never, ever again use GoDaddy for any service. What arrogance these large corporations are capable of developing towards their own customer base. I'm still aghast. They could have kept my business so easily. Stay away from GoDaddy. The word "evil" comes to mind.
I will be moving all my business away from GoDaddy as soon as that can be arranged.
Tuesday, November 27, 2012
Microsoft update kills web application via namespace collision
After investigating, I noticed a long, messy compile error that had not existed before, on a page in the cache which I had not created. I stared at the error and the page until I began to understand that there was probably a name collision on the word "Site". I had long used a master page called Site.master, so I renamed it Site1.master and repeatedly recompiled until all pages dependent on this master page had been changed to use the new name--after which the problem disappeared.
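For anyone who hasn't seen this flavor of error, here is a contrived, self-contained sketch of an ambiguous-name collision (all names hypothetical--this is not the actual ASP.NET-generated code):

namespace VendorA { public class Site { } }
namespace VendorB { public class Site { } }

namespace MyApp
{
    using VendorA;
    using VendorB;

    class Demo
    {
        static void Main()
        {
            // Site s = new Site();  // error CS0104: 'Site' is an ambiguous
            //                       // reference between 'VendorA.Site'
            //                       // and 'VendorB.Site'
            VendorA.Site ok = new VendorA.Site();  // fully qualifying resolves it
            System.Console.WriteLine(ok);
        }
    }
}

Renaming one of the types, as I did with Site1.master, makes the bare name unambiguous again.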
So answer me this. How come an update to something in .NET 4 breaks a web service running on .NET 3.5? And furthermore, how could anyone at Microsoft be dumb enough to suddenly purloin a common name such as "Site"? Probably many programmers around the world are cursing and going through the same discovery process as I write this. Bad Microsoft! Bad! Bad!
Monday, September 10, 2012
Undeletable Files on NTFS File System
I knew right away that I had introduced the problem while attempting to administer permissions on Pmwiki, a third-party application written in PHP that was never intended to run on a Microsoft web server. Permissions for the upload folder kept reverting so that uploading attachments to the wiki failed, and it was while wrestling with this periodically recurring conundrum that I shot myself in the foot by somehow removing my ownership of those three files. To get around the problem, I had made an entirely new upload folder (from a backup) and simply renamed the old "stuck" one.
Then, the next time I tried my handy zip-it-all-up backup trick, I got an error. The error, of course, happened right at the end of a long zip process, and the entire archive failed to be created. I now had no off-disk backup happening, all due to these three "stuck" files, which I could see in the folder but could neither read nor delete, nor change permissions on. How anyone could deal with such problems without Google, I cannot imagine. Thank the universe for search engines.
Within minutes of beginning my search, I found this support page on Microsoft's site. At the very bottom, I found what I needed, which was the "if all else fails" strategy, which Microsoft called "Combinations of Causes" (that gave me a chuckle). This strategy required me to use a utility called "subinacl.exe" which the support page cited as being part of the Resource Kit.
Googling some more, I soon found that the Resource Kit was a very expensive (as in $300) book with tools on a disk in the back. I wasn't going to buy it. Then it occurred to me just to search for "subinacl.exe", and thankfully, I found that Microsoft had made it available as a download. So I downloaded it. But idiot that I am, I failed to note where it had installed itself (somewhere obscure, I promise you). Had to uninstall it, and then reinstall, this time noting down the install location, which for those of you going through the same thing, I will state here was C:\Program Files\Windows Resource Kits\Tools\.
So then I took a deep breath, constructed the command line that I needed in Notepad, then opened a command window, browsed to the obscure install folder shown above, and carefully pasted my unwieldy command into the window, then (holding my breath), I hit return. A frightening amount of techno babble appeared as the command executed. After a few tries, I got it to succeed, though it still gave warnings. I had to do this four different times, once for each file and then for the containing folder. The model command line is:
subinacl /onlyfile "\\?\c:\path_to_problem_file" /setowner=domain\administrator /grant=domain\administrator=F
Altogether, the failures of Pmwiki and Microsoft file permissions have cost me hours of hassle over the five years I have managed the server. This latest offense was just the crowning touch. Managing file uploads on IIS is a challenge at any time, as the web server is rightly leery of anyone writing into a file zone that it serves. But this doggy PHP open-source piece of software (Pmwiki), tested only on Linux, is in retrospect hardly worth the effort. I haven't tried installing other wiki software yet (no time for the learning curve!), but I hope there might be one that works better than this on a Windows server.
Microsoft's support page did do the trick. The problem shouldn't be possible anyway. The big Administrator account should always be able to delete a file--what were they thinking?--but at least they provided the necessary utility to dig myself out of the pit. Still, I'm not exactly feeling kindly towards Microsoft at this moment. Or towards Pmwiki.
Thursday, June 28, 2012
ERROR - The 'Microsoft.Jet.OLEDB.4.0' provider is not registered on the local machine.
Thanks to Google and other bloggers, I learned that Visual Studio 2010 Professional compiles, by default, with an option called "Any CPU". But after recompiling the application with the "x86" option (which is an alternative to "Any CPU"), the application began to work on either a 32-bit or 64-bit CPU. Huh?
Apparently, with the "Any CPU" option, .NET ran the application as 64-bit (because it happened to reside on a 64-bit machine at the time) and a mismatch occurred when it tried to read from the 32-bit Access database. By forcing the compiler to be "x86", I forced the program to run as a 32-bit process even though it resides on a 64-bit machine, and no mismatch occurred.
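A quick way to see which way your process actually ended up running is a check like this--a minimal sketch, assuming .NET 4.0 or later (Environment.Is64BitProcess appeared in 4.0):

using System;

class BitnessCheck
{
    static void Main()
    {
        // with "Any CPU" on a 64-bit OS, this prints True;
        // with the "x86" platform target, it prints False, which is
        // what the 32-bit Jet OLEDB provider needs
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
    }
}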
I did notice that the application starts a bit more slowly than it did on a 32-bit machine, but once running, it warms up and runs just fine.
Those compiler options are very poorly named. And when Office someday upgrades to 64 bits, I'll probably have to recompile again. Thanks, Microsoft.
Saturday, December 03, 2011
How to write to the disk (make a writable folder) in ASP.NET
* the user creates an uploads folder and sends its file spec to the administrator
* the administrator opens the Internet Information Services (IIS) Manager, browses to that folder, and right-clicks it to change file permissions as follows for the NETWORK SERVICE account:
** remove Read and Execute permission
** add Write permission
Both permissions should be changed in one step, and that should be all that is necessary. But test the write and subsequent read carefully; if either does not work, delete the folder, create a new one, and start all over again.
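Here's the kind of smoke test I mean--a minimal sketch for a Web Forms code-behind page, assuming the folder is named "uploads" under the site root (the names are hypothetical):

protected void TestUploadsFolder()
{
    // resolve the folder to a physical path under the site root
    string path = Server.MapPath("~/uploads/writetest.txt");

    // write a file, then read it back; if either call throws,
    // the permissions are not right yet
    System.IO.File.WriteAllText(path, DateTime.Now.ToString());
    string contents = System.IO.File.ReadAllText(path);
}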
If you are using a hosting service, there should be some special procedure to get the permissions changed. In my experience, changing the permissions for this purpose has spotty success and can lead to a drawn-out hassle. I suppose it's worth it to have a reasonably secure server.
Labels: ASP.NET, System administration, Web development, Windows
Tuesday, November 01, 2011
VS.NET 2010 fails to compile program created earlier in VS.NET 2008
Error 9 Task failed because "sgen.exe" was not found, or the correct Microsoft Windows SDK is not installed. The task is looking for "sgen.exe" in the "bin" subdirectory beneath the location specified in the InstallationFolder value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v6.0A. You may be able to solve the problem by doing one of the following: 1) Install the Microsoft Windows SDK for Windows Server 2008 and .NET Framework 3.5. 2) Install Visual Studio 2008. 3) Manually set the above registry key to the correct location. 4) Pass the correct location into the "ToolPath" parameter of the task.
These suggestions were too bizarre even to consider, and so I Googled. Right away, I found this nice blog entry which helped me out, and just in case it were to go away, I'm duplicating the helpful information it contains below. So, courtesy of the "dukelupus" blog, everything after this paragraph is copied verbatim from that blog.
Changing the registry key will not help nor will adding C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\ to the path. I did not try other solutions...I used FileMon to check what Visual Studio is looking for – and it appears that it will always look for that file at C:\WINDOWS\Microsoft.NET\Framework\v3.5\, which does not contain sgen.exe.
Just copy sgen.exe from C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\ to C:\WINDOWS\Microsoft.NET\Framework\v3.5\ and everything will now compile just fine. Here, to make your life easier, copy command:
copy /y "C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sgen.exe" "C:\WINDOWS\Microsoft.NET\Framework\v3.5\"
Good luck!
Labels: C# and .NET, Microsoft Visual Studio, System administration, Windows
Wednesday, August 24, 2011
SQL Server Management Studio and the "Saving changes is not permitted" error.
"Saving changes is not permitted. The changes you have made require the folloing tables to be dropped and re-created. You have either made changes to a table that can't be re-created or enabled the option Prevent saving changes that require the table to be re-created."
Here's how to solve this in Microsoft SQL Server Management Studio 2008 through 2014:
1) Go into the Tools...Options... menu
2) In the popup, on the left, expand "Designers" by clicking on the plus
3) In the "Table Options" shown to the right, MAKE SURE that "Prevent saving changes that require table re-creation" is NOT checked.
I've lost a few hours of my life to this nuisance. Hope this will help someone else out of the conundrum.
Labels: SQL Server, T-SQL
Friday, June 10, 2011
ODBC workaround (Access to SQL Server) for 64-bit Windows 7
Even though all DSNs appeared to be configured correctly on my colleague's brand-new (64-bit) Windows 7 machine, and the ODBC connections passed their tests, the actual database declined to connect to the server. Thanks to various discussion groups, we finally figured out that the graphical user interface accessible from the Administrative Tools applet in Control Panel actually brings up the 64-bit ODBC application, whereas we (for backwards compatibility) needed the strangely hidden 32-bit System DSN window. To run it, we had to browse to this path:
C:\Windows\SysWOW64\odbcad32.exe
Running odbcad32.exe from that location brings up the 32-bit version of the ODBC connection setter-upper. There, we re-created all the System DSNs, and finally the Access databases were happy.
By default, the Windows GUI presents the 64-bit ODBC setter-upper (which I believe lives somewhere under C:\Windows\System32). Browsing manually to the SysWOW64 path above, running the application there, and re-adding the ODBC connections is what makes it work.
In the meantime, 3 days of work were lost to this problem.
Labels: DSN, ODBC, SQL Server, System administration, Windows
Thursday, May 20, 2010
The challenge of printing a .NET textbox
This post records the solution that finally worked for me, for the benefit of anyone else facing this challenge.
First, a little rant. Before the .NET framework was released (~2000), Microsoft provided a handy .print method on the RichTextBox and TextBox controls, and all a programmer needed to do was call it. But in Microsoft's almighty wisdom (and trying to be just like Java), the simple and highly useful .print method was removed in .NET, and now you have to do all the following steps successfully. And note, it's just as difficult in Java--what were the language developers thinking? I imagine they were thinking to provide maximum flexibility, but why not provide a quick-and-dirty out for the rest of us?
In the example below, my form (called JobForm) is printing the contents of a TextBox control (called textBoxRight).
STEP 1: DECLARE VARIABLES
You need a bunch of special fields in your form code to keep track of printing information. To get started, just use this:
#region printing declarations

private Font midlistRegular = new Font(
    "San Serif", (float)7.8, FontStyle.Regular, GraphicsUnit.Point);
private Font midlistRegular1 = new Font(
    "San Serif", (float)7.6, FontStyle.Bold, GraphicsUnit.Point);
private int printMargin = 1;
private int lastPosition = 0;
private int lastIndex = 0;

/// <summary>
/// Set in Paint (of form) for use when printing
/// </summary>
private float screenResolutionX = 0;

#endregion printing declarations
STEP 2: INSTANTIATE A PRINTDOCUMENT OBJECT AND CREATE ITS EVENT CODE
You need a special object that raises an event each time one page has been printed, so that the next page can be printed. It is a non-visible control called "PrintDocument" (in the System.Drawing.Printing library).
In the Windows Designer, drag a non-visible "PrintDocument" control onto your form (System.Drawing.Printing.PrintDocument). It will be instantiated on your form as "printDocument1". Double-click the PrintDocument control on the form to create the "PrintPage" event, and give it the following code (using your TextBox name instead of "textBoxRight"):
private void printDocument1_PrintPage(object sender, PrintPageEventArgs e)
{
    try
    {
        e.HasMorePages = this.PrintOnePage(
            e.Graphics,
            this.textBoxRight,
            this.printDocument1,
            this.screenResolutionX);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
STEP 3: CREATE THE PrintOnePage() METHOD
Now create the PrintOnePage() method needed by the above code. Use the boilerplate code below unchanged (and I apologize that it's so ugly):
/// <summary>
/// Goes through each line of the text box and prints it
/// </summary>
private bool PrintOnePage(Graphics g, TextBox txtSurface,
    PrintDocument printer, float screenResolution)
{
    // Font textFont = txtSurface.Font;
    Font textFont = new Font(
        "San serif", (float)10.0, FontStyle.Regular, GraphicsUnit.Point);

    // go line by line and draw each string
    int startIndex = this.lastIndex;
    int index = txtSurface.Text.IndexOf("\n", startIndex);
    int nextPosition = (int)this.lastPosition;

    // just use the default string format
    StringFormat sf = new StringFormat();
    // sf.FormatFlags = StringFormatFlags.NoClip | (~StringFormatFlags.NoWrap);

    // get the printable page height in pixels, less a one-inch bottom margin
    int lastPagePosition = (int)(((printer.DefaultPageSettings.PaperSize.Height / 100.0f) - 1.0f) * (float)screenResolution);

    // int resolution = printer.DefaultPageSettings.PrinterResolution.X;
    // use the screen resolution for measuring the page
    int resolution = (int)screenResolution;

    // calculate the maximum printable width in pixels from the default paper size and the margin
    int maxwidth = (int)((printer.DefaultPageSettings.PaperSize.Width / 100.0f - this.printMargin * 2) * resolution);

    // convert the margin to pixels
    int printMarginInPixels = resolution * this.printMargin + 6;

    Rectangle rtLayout = new Rectangle(0, 0, 0, 0);
    int lineheight = 0;
    while (index != -1)
    {
        string nextLine = txtSurface.Text.Substring(startIndex, index - startIndex);
        lineheight = (int)(g.MeasureString(nextLine, textFont, maxwidth, sf).Height);
        rtLayout = new Rectangle(printMarginInPixels, nextPosition, maxwidth, lineheight);
        g.DrawString(nextLine, textFont, Brushes.Black, rtLayout, sf);
        nextPosition += (int)(lineheight + 3);
        startIndex = index + 1;
        index = txtSurface.Text.IndexOf("\n", startIndex);
        if (nextPosition > lastPagePosition)
        {
            this.lastPosition = (int)screenResolution;
            this.lastIndex = index;
            return true; // reached end of page
        }
    }

    // draw the last line
    string lastLine = txtSurface.Text.Substring(startIndex);
    lineheight = (int)(g.MeasureString(lastLine, textFont, maxwidth, sf).Height);
    rtLayout = new Rectangle(printMarginInPixels, nextPosition, maxwidth, lineheight);
    g.DrawString(lastLine, textFont, Brushes.Black, rtLayout, sf);
    this.lastPosition = (int)screenResolution;
    this.lastIndex = 0;
    return false;
}
STEP 4: ADD CODE TO YOUR FORM'S PAINT EVENT
In the Windows Designer, open your form in graphical view mode. Open the form's Properties window and click the lightning bolt to see events. Double-click the form's Paint event to create it, and paste the boilerplate code from my Paint event below into your form's Paint event (your event will have a different name than mine does below, based on your form's name):
private void JobForm_Paint(object sender, System.Windows.Forms.PaintEventArgs e)
{
    // save the screen's horizontal resolution (dots per inch) here
    this.screenResolutionX = e.Graphics.DpiX;
    // set the last position of the text box
    this.lastPosition = (int)this.screenResolutionX;
}
STEP 5: ACTUALLY PRINT THE TEXTBOX
To actually print the contents of the TextBox, you'll need code like this in a print menu or button event:
PrintDialog printDialog1 = new PrintDialog();
if (printDialog1.ShowDialog() == DialogResult.OK)
{
    this.printDocument1.PrinterSettings = printDialog1.PrinterSettings;
    this.printDocument1.Print();
}
I haven't paid close attention to all the above code since I got it working. I snarfed much of it from various sources on the web (thank you, bloggers!). It could be enhanced or cleaned up a lot. Maybe this will help another programmer get it done.
Labels: C# and .NET, Programming
Thursday, March 04, 2010
Wrestling to sort the ASP.NET GridView
Umpteen useless web articles later, I resorted to the paper books stashed on my shelf at home. First stop was Murach's ASP.NET 2.0, which is alleged to be so good. But it held no love for me. Second stop was Dino Esposito's "Programming Microsoft ASP.NET 2.0: Core Reference"--and finally, I got the help I needed.
I'm blogging about this because a good book deserves real credit. Many mysteries were unraveled by the Esposito book, including that I needed to explicitly re-populate the GridView when its "Sorting" event fired. Esposito's directions were extremely explicit: use the "Sorting" event's GridViewSortEventArgs (the "e" parameter) to find out the sort key, and write a special stored procedure that uses the sort key to ORDER the data differently. These last bits of information were the treasure that finally allowed me to get sorting to work.
I'm posting a copy of the rather odd-looking stored procedure that I ended up using below, for your edification. The "@order_by" parameter names the column on which to sort, and the odd way of constructing the query from strings allows brackets to fit around any strange or keyword column names. (Since @order_by ends up inside dynamic SQL, it would be even safer to wrap it with QUOTENAME() rather than raw brackets, but the brackets served my purpose.)
CREATE PROCEDURE [dbo].[lsp_NRSA_find_counts_and_person_by_naded]
    @naded_id int,
    @order_by nvarchar(12)
AS
BEGIN
    IF @order_by = ''
    BEGIN
        SET @order_by = 'slide'
    END

    -- the int parameter must be converted to a string before it can be
    -- concatenated into the dynamic SQL
    DECLARE @naded nvarchar(12)
    SET @naded = CONVERT(nvarchar(12), @naded_id)

    EXEC ('SELECT * ' +
          'FROM vw_NRSA_cemaats_by_count_and_person ' +
          'WHERE naded = ' + @naded +
          ' ORDER BY [' + @order_by + ']')
END
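And for completeness, here is a hedged sketch of the code-behind side as I remember the shape of it--GridView1, BindGrid, connectionString, and nadedId are placeholders, not my actual production names:

// requires: using System.Data; using System.Data.SqlClient;
// and using System.Web.UI.WebControls

// hypothetical values
private string connectionString = "Server=.;Database=mydb;Integrated Security=SSPI";
private int nadedId = 42;

protected void GridView1_Sorting(object sender, GridViewSortEventArgs e)
{
    // e.SortExpression carries the sort key; hand it to the stored procedure
    this.BindGrid(e.SortExpression);
}

private void BindGrid(string orderBy)
{
    using (SqlConnection conn = new SqlConnection(this.connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "lsp_NRSA_find_counts_and_person_by_naded", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@naded_id", this.nadedId);
        cmd.Parameters.AddWithValue("@order_by", orderBy ?? "");

        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            this.GridView1.DataSource = reader;
            this.GridView1.DataBind();
        }
    }
}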
Labels: ASP.NET, Programming, Web development
Tuesday, November 10, 2009
Sql Server Management Studio 2008 logon persistence problem
This made it necessary to click to change the user and enter a new password about 80 times per day--all because this premium, allegedly sophisticated software has decided never to forget a former entry, even though I have deleted the Registered Server, recreated it, and so forth. I'm frankly angry about this nuisance, which wouldn't matter if I didn't have to open and close the tool so many times per day.
If anyone has any concrete suggestion on how I can defeat this thing, please let me know. I see from the internet that I am not the only one having this frustration, but I have yet to find any blog entry or suggestion to alleviate the problem.
UPDATED with an answer in Aug. 2010:
For Sql Server Management Studio 2008 on Windows XP, to restart all your logins, delete the file:
or possibly:
----
For Sql Server Management Studio 2008 on Windows 7, to restart all your logins, delete the file:
Labels: SQL Server, System administration, Windows
Monday, September 28, 2009
Delete a maintenance plan after changing SQL Server 2008 name
I've seen a few posts on how to delete an older maintenance plan after you've changed the computer name for a default installation of SQL Server 2008, but none of them fully solved the problem. Below is what I had to do--with the caveat that you have to find your own IDs:
USE msdb
DELETE FROM dbo.sysmaintplan_log
WHERE subplan_id = '36F8247F-6A1E-427A-AB7D-2F6D972E32C1'
DELETE FROM dbo.sysmaintplan_subplans
WHERE subplan_id = '36F8247F-6A1E-427A-AB7D-2F6D972E32C1'
DELETE FROM dbo.sysjobs
WHERE job_id = '3757937A-02DB-47A6-90DA-A64AE84D6E98'
DELETE FROM dbo.sysmaintplan_plans
WHERE id = 'C7C6EFAA-DA4D-4097-9F9F-FC3A7C0AF2DB'
Labels: SQL Server, System administration
Sunday, May 03, 2009
C# code to get schema of an Access table
// requires: using System.Data; using System.Data.OleDb;
// and using System.Windows.Forms (for MessageBox)
OleDbConnection conn =
new OleDbConnection(
"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" +
"C:\\phycoaide\\phycoaide.mdb;Persist Security Info=False;");
// retrieving schema for a single table
OleDbCommand cmd = new OleDbCommand("taxa", conn);
cmd.CommandType = CommandType.TableDirect;
conn.Open();
OleDbDataReader reader =
cmd.ExecuteReader(CommandBehavior.SchemaOnly);
DataTable schemaTable = reader.GetSchemaTable();
reader.Close();
conn.Close();
LogFile.WriteLine(" ");
foreach (DataRow r in schemaTable.Rows)
{
LogFile.WriteLine(" ");
foreach (DataColumn c in schemaTable.Columns)
{
LogFile.WriteLine(c.ColumnName + ": " + r[c.ColumnName]);
}
}
MessageBox.Show("done");
The LogFile class creates a file in the application folder from which your program runs:
//
// No copyright; free for reuse by anyone
//
// Pat G. Palmer
// ppalmer AT harbormist.com
// 2009-05-04
// Opens a single-threaded log file to trace execution
namespace Ansp
{
using System;
using System.IO; // file readers and writers
using System.Windows.Forms; // Application object
/// <summary>
/// Singleton that appends log entries to a text file
/// in the application folder. If the file grows
/// to be too large, it deletes itself and starts over.
/// The file is kept open until the application ends
/// and implements the "dispose" pattern in case things
/// do not end gracefully.
/// </summary>
public class LogFile : IDisposable
{
private static int maxsize = 470000;
private static string fileSuffix = "_log.txt";
private static string fileSpecification;
private static StreamWriter filewriter;
private static LogFile instance;
private LogFile()
{
}
~LogFile()
{
this.Dispose(false);
}
public static void InitLogFile()
{
if (instance == null)
{
instance = new LogFile();
}
string stringMe = "InitLogFile: ";
try
{
if (Application.ProductName.Length == 0)
{
fileSpecification = Application.StartupPath + "\\" +
"Test" + fileSuffix;
}
else
{
fileSpecification = Application.StartupPath + "\\" +
Application.ProductName + fileSuffix;
}
// restart file if too big
if (File.Exists(fileSpecification))
{
FileInfo myFileInfo = new FileInfo(fileSpecification);
if (myFileInfo.Length > maxsize)
{
File.Delete(fileSpecification);
}
myFileInfo = null;
}
// restart file with appending
filewriter = new StreamWriter(
fileSpecification, true, System.Text.Encoding.UTF8);
// start log with standard info
WriteLine("\r\n---------------------------------------------");
string tempString = stringMe +
Application.ProductName + " " +
Application.ProductVersion +
"log opened at " +
DateTime.Now;
WriteLine(tempString);
WriteLine(stringMe + "username=" + SystemInformation.UserName);
WriteLine(stringMe + Application.StartupPath);
}
catch
{
}
}
public static void WriteLine(string myInputLine)
{
try
{
if (instance == null)
{
InitLogFile(); // first time only
}
if (myInputLine.Length != 0)
{
filewriter.WriteLine(myInputLine);
filewriter.Flush(); // update file
}
}
catch
{
}
}
public static void Close()
{
instance.Dispose();
}
/// <summary>
/// Implement IDisposable.
/// Do not make this method virtual.
/// A derived class must not override this method.
/// </summary>
public void Dispose()
{
this.Dispose(true);
//// Now, we call GC.SupressFinalize to take this object
//// off the finalization queue and prevent finalization
//// code for this object from executing a second time.
GC.SuppressFinalize(this);
}
private void Dispose(bool disposing)
{
if (disposing)
{
// no managed resources to clean up
}
if (instance != null)
{
if (filewriter != null)
{
try
{
filewriter.Flush();
filewriter.Close();
}
catch
{
}
filewriter = null;
} // end if filewriter not null
} // end if instance not null
}
} // end class LogFile()
} // end namespace
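Typical usage is just this (a hypothetical example; the class initializes itself on first use):

// anywhere in the application:
LogFile.WriteLine("import started");
LogFile.WriteLine("processed 42 rows");

// optionally, at shutdown:
LogFile.Close();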
Labels: Microsoft Access, Programming
Thursday, December 27, 2007
Suppressing VS.NET Compiler Warnings (2005, 2008)
As a rule, I don't like suppressing compiler warnings, but there is a time for everything. My time came when I inherited a huge mass of ill-behaving C# code and began adding XML comments. I was immediately overwhelmed by hundreds of warnings that said "Missing XML comment for publicly visible type". I wanted to compile the comments that I had added without being nagged by the compiler for not having added the remaining 300 possible XML comments as well.
I knew that Visual Studio 2005 would let me suppress specific warnings in the project build properties. However, I didn't know the warning number that I needed to supply. Microsoft, in their great goodness, has suppressed the showing of warning numbers--they show only the text. A few googles later, I knew that it was either 1591 or CS1591, but no one anywhere told me how, in general, to find the full list of warning numbers. I've wanted this list many a time in the past, so I set out to find out, once and for all.
Eventually, I found that I needed to start at the top-level C# Reference page in MSDN2 (for the appropriate version of VS.NET), then search on "compiler warning " + "warning text". So searching on "compiler warning missing XML comment" got me the precious warning number that I needed, which is CS1591. But then I had to psychically understand, of course, that the CS must be left off, and only the 1591 entered.
[Screenshot: my glorious build screen, with 1591 in the suppression box, which finally suppressed the evil hundreds of unwanted warnings.]
UPDATE in Oct 2008: Now that I am using Visual Studio 2008, I have learned that I can right-click over a warning in the Error List pane, and it will pop up documentation about the warning that includes its error level and number, and from that, I can derive the 4 digits to place in the suppress box of the project Build properties. It is not necessary to search on the Microsoft website. I don't know if this feature was present in Visual Studio 2005 (and I just didn't know it), or not.
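One more option worth knowing about (not something I used above, but standard C# since Visual Studio 2005): the same warning can be silenced in code with #pragma, scoped to just the region you haven't documented yet:

#pragma warning disable 1591
public class NotYetDocumented
{
    public int StillNeedsAComment;
}
#pragma warning restore 1591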
Labels: C# and .NET, Visual Studio