Tech Blech

Tuesday, January 02, 2024

Switchbot blind tilts - a review



I wanted to get automatic blind opener/closer devices (so-called "tilts").  It's possible to buy fully automated Venetian blinds from the get-go, and if you're ready to buy all new blinds, that would be the way to go.  I already had fairly new wooden blinds, though, so I wanted to retro-fit my existing blinds with a device which would tilt them automatically in a programmable way.

When I did my research in early 2023, there were 3-4 companies that sold retro-fit tilters, but it seemed to me only two of them were likely to satisfy me.

One solution was very expensive, but people tended to be highly satisfied with the results.  I think the product was called Somfy Clever.  The cost would have been $230 per blind, and you must take the blinds off the wall to replace the tilt mechanism with a small motor.  I think you also have to buy a $230 hub.  I don't know how they are powered.  There is a physical remote control, but I don't know how "automated" that solution is in terms of programming the blinds to behave how you want automatically.  If you want a perfectionist solution that is essentially invisible, and have the money for it, this one is worth a long look.

I, however, didn't want to take my blinds off the wall, or spend so much either, because the Cadillac solution would have cost me about $3000 for 11 windows, not to mention a complex installation.  Instead, I chose to install a gadget hanging off the blind that physically turns the wand.  It is from a Chinese startup called Switchbot.  I've installed Switchbots on 7 of my 11 blinds and they work adequately, though the documentation was poor and I had to redo my first installations more than once until I had learned a lot from trial and error.  It is one of the cheaper possibilities, coming in at around $70 per blind plus buying a hub, also about $70.  I ended up buying two hubs, one for each end of the house, because one hub did not seem to reach all over the house.  When you open the app, the hubs also display the temperature and humidity in that room.  I have actually enjoyed having the humidity readings from the hubs during hot, steamy summer weather.

The Switchbot add-on devices allow me to automatically open/close the blinds around sunrise/sunset and to tilt the south blinds during hot weather.  I can also open/close the blinds by talking to Alexa, or manually.  But the installation is not easy to learn.  It required getting on a ladder and removing the cornice for each blind, then measuring things very carefully.  The instructions were minimal, and I made some mistakes and had to learn by trial and error.  They really need to improve the instructions.  There are a lot of videos on YouTube showing the installation, where things go perfectly, but none of them anticipated the kinds of issues I ran into--all of which I was able to overcome, but not without some extensive experimentation.

Switchbot responds to support emails and was very helpful via email, but their response time can be a day or more after the first trouble report.

Each blind tilt has a rechargeable battery that needs to charge about 6 hours (from empty) to get full.  From a full charge, the tilt is supposed to last six months under "light usage".  Each Switchbot tilt comes with a little solar panel intended to mount in the upper corner of the window.  I disliked the solar panels taking up some of the window space, and my windows are shaded, so they weren't getting enough sunlight.  Thus, I bought a charger (the cable was included) for each tilt, and then plugged each tilt into the nearest wall outlet.  Buying the chargers, and in some cases the right size of extension cord, added to the cost and complexity of the install.  Next, I discovered that the tilt has a light that is visible after dark, and I didn't want that in the bedrooms, so I decided to put all the Switchbot chargers on smart switches so they are only powered on for six hours during daylight, accomplishing the same thing the solar panels would have.  Also, in some cases I needed to install a cable channel (a long, straight plastic tunnel) on the wall so that the extension cords didn't look tacky or touch the floors.  I used white gaffer's tape (strong but removable) to hold the cables in place at the top of the blind and, sometimes, to tape the cables to the wall.
 
The power issues drove the cost of each blind up to closer to $100 per window.  

The next problem was that some Switchbots periodically get confused about where the home position is (usually after a few weeks) and ask for recalibration.  Calibrating a Switchbot only takes a minute, but it requires a smart phone with Bluetooth enabled, and then turning the blinds to fully closed down, fully open, fully closed up, and back again to fully open.

In the year since I got them, the Switchbots have asked me to upgrade the firmware only once.  I had to do this individually for each Switchbot, but it went without a hitch as long as I was able to connect via Bluetooth to the Switchbot Hub nearest that blind.

Lastly, there is the color issue.  The Switchbot tilt gadgets only come in white, whereas my blinds are brown, so the look is not perfect.  If your blinds are a light color, this won't be an issue. 

All in all, once I got them all installed and tuned correctly, I love having them, despite their imperfections.  Since I work from home, I'm there 24x7, and automating the blinds means one less chore morning and evening.  It also helps me wake up in the morning to have the blinds open automatically, letting the beautiful morning light stream in.

Here is the charger that I ordered for each blind tilt (they come in a 3-pack): https://www.amazon.com/gp/product/B079R9N7WS/

Here is white gaffer's tape, useful anytime you need to tape down a cord somewhere but may want to remove the tape later (it comes in various colors too): https://www.amazon.com/gp/product/B00LMNYFHI/

It may be obvious, but the house needs to have wifi working first.

Wednesday, August 25, 2021

Whole Foods delivery persons scan people's driver's licenses into their mobile phone app, a wildly risky procedure

In a grocery delivery order yesterday from Whole Foods, I requested a bottle of red and white cooking wine.  The delivery person rang my doorbell and requested ID because the order included "alcohol", though she said, "You are obviously over 18".  I brought her my driver's license, handed it to her, and then noticed she was preparing to photograph my driver's license with her cell phone.

I snatched the driver's license back and attempted to explain to her that I did not consider it safe to allow her to take a photograph of my driver's license, which includes every piece of information needed for identity theft short of an actual social security number.  Her broken English was not sufficient for us to communicate.  She put me on a call via her phone with "customer support", a man whose Spanish-accented English was even worse than hers.  This man insisted that I had to allow my ID to be photographed.  "Just do it", he said.  "It's required."  After attempting to explain and getting the same rote response, I hung up on him.

After several minutes, the poor delivery person finally gave up and left.  I looked around in my order email and online, trying to find a customer support number for Whole Foods or even just for Amazon.  The only number I could find sent me directly to the recent delivery person.  In desperation, I phoned my local Whole Foods store and demanded to speak to the manager.  After several minutes' wait, I spoke with a man who claimed to be the store manager, but he said that his store is one of those Whole Foods stores that has no voice in how Whole Foods deliveries are handled.  However, he listened to my complaint and then gave me an actual phone number for support on Whole Foods delivery: 877-477-3527.

Upon calling this support number, I reached a help line person whose English was understandable.  She first sent a security code to my mobile phone so that I could verify my identity--a sign of Amazon's usually reasonable security procedures.  Then I explained the entire situation, naming my delivery person and requesting that she not receive any kind of discipline over my refusal to allow my license to be scanned.  I expressed my disappointment that Amazon was taking the risk of scanning customer IDs in a phone app on many different mobile phones for its delivery people.  I told her I understand that legally they need to verify that the order came from someone who is not a minor, but as in a liquor store, showing the delivery person an appropriate form of ID should be enough.  The delivery person ought to be able simply to note down which kind of ID was shown.  After all, liquor stores don't scan our driver's licenses.

Next, she smugly assured me that scanned ID photos "are not stored locally on the device".  And my response was, "How do you know that?  How can you even begin to promise that?  I won't allow my driver's license to be photographed by a stranger onto their phone.  It makes me nervous enough to use a banking app when making check deposits, but at least, I have personally authenticated myself to the bank with end-to-end encryption."  I continued to explain that we customers know absolutely nothing about the app used by Amazon drivers, and we have no reason whatsoever to assume it doesn't store those photos locally, at least temporarily.  Also that apps frequently have been found to lie about data collection and steal and misuse customer information, as anyone would know who reads technical journals as I do.  This practice, I argued, is also not compliant with the basic tenet of data collection, which is, collect no more personal information about people than is absolutely necessary.  I went on and on.  How many episodes have occurred, for example, where an app illegally accessed the camera or microphone of a device?  News stories about these breaches of privacy occur almost daily.  Given the slow pace of cell phone communications with the phone network, I feel certain that photos must indeed go temporarily to the hard drive of the device, where they risk not being deleted appropriately later.

In the end, she promised to record and escalate my complaint to her management.  Whether she actually will do so or not, I do not know.  I think it more likely that I'll be noted down as a difficult or problem customer.  Hence, this blog.  People should refuse to allow strangers to photograph sensitive materials on their cell phones.  This is basic common sense in today's "bad internet" environment.

Saturday, December 19, 2020

An episode with the dreaded "Windows doesn't recognize the headphones" problem

Just over a year ago, I bought a Dell 7400 i7 laptop at my university and paid over $1800 for it, which included a two-year technical support contract.  That is the most money I have ever paid for a computer in my life.  And as often happens, after about a year of use the laptop developed a problem: the machine stopped recognizing the headphones.  The audio still worked fine--out loud--but in 2020, this year of the pandemic, when I'm living as a shut-in with a spouse trying to work in the next room, having no headphones became a real hardship.

I tried the usual things.  I Googled, I looked in Dell tech support forums, and I pursued various possible "cures" carefully and diligently.  This was made worse because I was using three monitors and a hub to interface with a scanner, an external drive, an external fan, an external keyboard, etc.  And one of the monitors also had an audio jack in it, and the monitor's audio jack actually worked.  But that jack was located on the back of the monitor, so it was impossible to reach to plug anything in or out.  Nor was this remote monitor jack good enough to satisfy web-based music collaboration software such as JamKazam, which I had been hoping to use but had been unable to use throughout the pandemic due to this stupid headphones-not-being-detected problem.

Weeks went by, then months, and all my efforts failed.  Meanwhile, I had removed everything docked to the laptop, and all external audio drivers, in case they could have been the problem.  And finally I was pissed off and desperate enough to telephone Dell Tech Support.  Having purchased Dell laptops for over 20 years, I know what I'm talking about when I say having to telephone tech support is a crap shoot.  Dell has always tried hard, but it is a large company, and its help lines are outsourced to overseas firms with a frankly mixed record.  And Dell has, or used to have, multiple tech support systems, one for corporate customers, another for academic customers (me, this time), and yet another for home consumers.

Dell support seems to have improved, though.  I had bought this machine through a university discount program, which gets even better tech support from Dell than normal people get, and the machine was still under warranty.  And this time, I immediately got sent to the correct part of the support chain.  There was a long wait on the line, but then I got to speak to a person in the Philippines, and she had really good English and furthermore was smart and empowered to get to the bottom of an issue--which was not always the case in past years.

So I made my report, and assured tech support that yes, I had tried multiple times to uninstall the audio devices in software, rebooted to let them reinstall themselves, and then rebooted twice more to install the very latest drivers from Dell.  Yes, nothing was disabled.  Yes, I had tried multiple headphones and I knew those worked elsewhere.  After these long discussions, they eventually agreed with me that possibly the physical audio jack had simply failed, because audio jacks are dumb physical devices with little wire parts in them that can get bent from stress.  And--guess what?  The audio jack was not modular but was fused into the motherboard, so the entire motherboard would need to be replaced.  And I was warned that if that did not do the trick, the next step would be to wipe the Windows operating system and start over, because then the problem would definitely be proved to be software.

I really, really did not want to start my OS over; that would cause me to lose literally weeks of effort getting everything I need for my work reinstalled.  And I had already been thoroughly inconvenienced by this problem.

Meantime, Dell tech support got a little confused, and its left hand didn't know what the right hand was doing.  One Dell manager emailed me that the motherboard would not be available until nearly two months in the future--would I accept a replacement system instead?  I responded yes, if that was the best they could do, I'd deal with the replacement rather than wait.  But on the same day, I started getting texts and calls from another stream indicating that a tech repair person wanted to come to my house the next day, which was looking to be during a major snowstorm, and would I please indicate whether I'd be home and make sure no one in the household was sick.  I answered yes, I would be home, and no, no one was sick.

Two days later, as it happened (there was a snowstorm after all), a Dell tech support person did indeed show up at the house after reaching my phone to make an appointment.  It was not a convenient time for me (I had to miss yoga class) but I wasn't going to delay it for any reason.  I locked my spouse and cat into a bedroom, met the tech at the door wearing a mask, and he came masked into the house and labored busily over the laptop for about 30 minutes, after which the headphones still did not work.  Nevertheless, he advised me to call tech support again "and try some things."

I felt depressed.  Would I need to wipe Windows after all, and would even that fix the problem?  But after a few hours of moping, I rallied and tried all the usual shenanigans once again--removed the audio devices in the control panel, rebooted, let them reinstall themselves.  Still didn't work.  Went to Dell's support site, downloaded the latest drivers and installed those, requiring two more reboots (one to remove old drivers, one to install new ones). And--it worked!  My headphones are now detected when I plug them in--for the first time in maybe 6 months!  There is a god.

And it makes me wonder why in the world something as dumb as an audio jack would not be designed to be modular.  It should be plug and play.  Just like a vacuum cleaner part that customers could change out themselves.  But no, the entire bloody motherboard had to be changed, involving a house call by a technician.  Any other year, Dell would probably have made me ship the computer in and get it back three days later via overnight shipping, but in this 2020 pandemic year, shipping is so buggered it would probably take three weeks to ship instead of days.  So thank you, Dell, for having decent tech support even if they got a little confused about what was going on and two different people were emailing me about two entirely different repair strategies.

And thank the universe that I didn't have to wipe the operating system, and after spending only maybe 40 or 50 hours total on the problem (on my part, including the searches and failed attempts to troubleshoot before calling tech support) it does appear to be solved.

Bottom line is, based on all my experience, it is typical for a laptop to have something fail, and that can be anytime within the first year.  So it is probably worth it to purchase a tech support contract.  For this reason, I sometimes advise friends to shop at a local Best Buy so that, when that failure does occur, they can run the laptop in to the Best Buy Geek Squad for repair.  Dell is also, I think, still a decent vendor if you buy a support contract along with the laptop.


Wednesday, August 14, 2013

Dropbox and SendSpace: useful and reliable file services

There are so many ways to move files around, and to back files up, that it can make one's head spin.  I work across several different computers in various locations.  I've ended up relying heavily on both the Dropbox and Sendspace file services.  Both are free for a limited file size and bandwidth, and both are worth paying for if you need to transfer or back up a lot of files.  I've now been a paying customer of both services for a couple of years, and I've found them both to be reliable and easy to use.  That said, they are not identical.  Dropbox is useful for backups and sharing regularly-used files across my many different work computers.  Sendspace is useful for transferring very large files among computers, either mine or other people's.

Dropbox will store your files "in the cloud" (that is, on a server, or servers, somewhere on the internet).  You access the files via a special "Dropbox" folder on your computer, which is created by the Dropbox installer.  Dropbox will automatically copy any files you deposit in its folder up to its servers.  Then, you'll be able to reach those same files on any other computer where you've also installed Dropbox using the same logon account.  To get started on a given computer, you just download and install the Dropbox client, run it, and log on to your Dropbox account in the client.  The client program manages your Dropbox folder.  It is very smart about synchronizing local and server files, and it gives a good visual indication of when the synchronizing is done.

Sendspace will allow you to transfer very large files.  I generally zip, or compress, one or more files into a single huge file before uploading to the Sendspace server.  Sendspace is drop-dead simple to use.  You don't need any coaching from me--just go to the site and follow the instructions.  When a file is too huge to move around by any other means, Sendspace can nearly always move it between any two computers, as long as they both have access to the internet.  Sendspace's free service offers the sender a link to delete the file from its servers, but if the sender doesn't bother, the file will be deleted automatically after two weeks.

Paying for Sendspace will increase the maximum allowed file size from huge to humungous.  It will also allow you to keep files up on their servers indefinitely (as long as your account is paid up).

To send a file to someone else, you do have to give Sendspace the recipient's email address; Sendspace then emails the person a link which they can use to download the file.  As far as I can tell, Sendspace does not mine the emails and never spams any user or recipient of files.  That the service has remained spam-free is unusual these days, and that makes it one of my favorite internet companies right now, and one of the few I can recommend whole-heartedly.

Good services deserve good publicity; hence this blog entry.  Give them a try!  It won't cost you anything to try them out.  However, I don't recommend that you send any sensitive data using these (or any other "cloud-based") services.  There is never any guarantee that a third party company won't look inside your data.  Thankfully, though, the kinds of files I'm sending around are not likely to be desirable to anyone but me or my immediate co-workers, and they don't contain anyone's private information.

Friday, July 05, 2013

GoDaddy hassle

[Screenshot: the site with SSI's broken]
[Screenshot: a healthy page, for comparison]
Yesterday, I received a trouble report about a Linux shared-host website which resides on GoDaddy servers.  To my horror, the site now looked like the page shown on the right, instead of the page shown below it.  It was pretty clear that server-side includes (SSI's) had stopped working, and since I had not updated the site in a couple of weeks and all had  been working well up until yesterday, it was also clearly because of something that GoDaddy had done to the server.   If nothing else, their server logbook ought to show what had been done, and so a developer like myself should be able to figure out how to adapt.  Bad enough that no advance notice had been given.

I immediately put in a call to GoDaddy technical support to find out what had happened and get some help figuring out how to get the site back to its former state of health.  After navigating numerous computerized phone menus and waiting on hold for about 15 minutes, I finally reached a human being, who immediately put me on hold and then disconnected the call.  This person did not call me back, so after a few more minutes, I put in a second call to GoDaddy support.  Same drill: after about 15 minutes, I got a person, who didn't know anything and put me on hold while he "contacted the server group".  After another 15 minutes, he returned to announce that I would have to fix the problem myself, as it was a scripting problem.  OK, I enquired, how shall I fix the problem?  My code hasn't changed.  And in the meantime, I had verified by extensive web searching that GoDaddy's forums had no help page showing how server-side includes ought to work.  Further, there were many entries in GoDaddy's forums within the past two weeks by other customers whose server-side includes had also stopped working.  "Sorry", the tech support guy said, "it's a scripting problem and we don't touch your code.  You'll have to fix it."

I was now waffling between disbelief and rage.  After spending another hour trying every wild suggestion, and everything I've ever encountered to get server-side includes working on Linux, I "patched" the problem for the short term by eliminating the SSI's altogether in the important pages of the website, so that my site once again had graphics and styling.

Returning the next day, fresh and rested, I was able to get server-side includes working again by making this small change:


    Bad directive:     <!--#include virtual="./shared/insert.txt" -->

    Good directive:   <!--#include file="./shared/insert.txt" -->

Really?  One word change fixed this?  And GoDaddy tech support is too stupid to tell this to customers?  My best guess is that GoDaddy changed something in their Apache configuration so that "virtual" includes (which the server resolves as URL subrequests) stopped resolving, while "file" includes (plain paths relative to the including document) still work.  Needless to say, I don't trust GoDaddy.  I fully expect now that the pages will stop working at any moment due to some unknown change in their server configuration.

And, I will never, ever again use GoDaddy for any service.  What arrogance these large corporations are capable of developing towards their own customer base.  I'm still aghast.  They could have kept my business so easily.  Stay away from GoDaddy.  The word "evil" comes to mind.

I will be moving all business from GoDaddy as soon as can be arranged.

Tuesday, November 27, 2012

Microsoft update kills web application via namespace collision

In my work at the Academy of Natural Sciences of Drexel University, I administer a database server and web server for the Phycology Section (on which run a bunch of REST web services on .NET 3.5).  Today, after downloading a number of Windows updates on the web server, one of the web services (here's an example) was broken.  The updates I had downloaded are shown here: [screenshot: list of installed Windows updates]
After investigating, I noticed a long, messy compile error that had not existed before, on a cached page which I had not created.  I stared at the error and the page until I began to understand that perhaps there was a name collision on the word "Site".  I had long used a master page called Site.master, so I changed it to Site1.master and repeatedly recompiled until all pages dependent on this master page had been changed to use the new name--after which the problem disappeared.
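For anyone hitting the same thing, the rename touches three places: the master page file itself, its code-behind class, and every content page's MasterPageFile directive.  Here is a sketch of the code-behind after the rename (the namespace and file names are my own placeholders, not necessarily what collided on your machine):

// Site1.master.cs -- code-behind for the renamed master page.
// The point is that the partial class is no longer the bare
// name "Site" that now collides with a framework type.
namespace Phycology
{
    public partial class Site1 : System.Web.UI.MasterPage
    {
    }
}

Each content page's directive then changes from MasterPageFile="~/Site.master" to MasterPageFile="~/Site1.master".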

So answer me this.  How come an update to something in .NET 4 breaks a web service running on .NET 3.5?   And furthermore, how could anyone at Microsoft be dumb enough to suddenly purloin a common name such as "Site"?  Probably many programmers around the world are cursing and going through the same discovery process as I write this.  Bad Microsoft!  Bad!  Bad!

Monday, September 10, 2012

Undeletable Files on NTFS File System

Maintaining a web server on Microsoft Windows Server 2008 has mostly been straightforward and not very time-consuming.  But recently I was confronted with a small but extremely annoying hassle, having to do with 3 files within the web server's area that could not be read, could not be deleted, and could not even be included within a .zip file.  It was this last issue that first got my attention; I had been accustomed to .zip up the entire wwwroot area from time to time to ship it off-disk as a backup.  This began to fail.

I knew right away that I had introduced the problem while attempting to administer permissions on Pmwiki, a third-party application written in PHP that was never intended to run on a Microsoft web server.  Permissions for the upload folder kept reverting so that uploading attachments to the wiki failed, and it was while wrestling with this periodically recurring conundrum that I shot myself in the foot by somehow removing my ownership of those three files.  To get around this boondoggle, I had made an entire new upload folder (from a backup), and just renamed the old "stuck" upload folder.

Then, the next time I tried my handy .zip-it-all-up backup trick, I got an error.  The error, of course, happened right at the end of a long .zip process, and the entire archive failed to be created.  I now had no off-disk backup happening, all due to these three "stuck" files, which I could see in the folder, but which I could neither read, delete, nor could I change permission on them.  How anyone could deal with such problems without Google, I cannot imagine.  Thank the universe for search engines.

Within minutes of beginning my search, I found this support page on Microsoft's site.  At the very bottom, I found what I needed, which was the "if all else fails" strategy, which Microsoft called "Combinations of Causes" (that gave me a chuckle).  This strategy required me to use a utility called "subinacl.exe" which the support page cited as being part of the Resource Kit.

Googling some more, I soon found that the Resource Kit was a very expensive (as in $300) book with tools on a disk in the back.  I wasn't going to buy it.  Then it occurred to me just to search for "subinacl.exe", and thankfully, I found that Microsoft had made it available as a download.  So I downloaded it.  But idiot that I am, I failed to note where it had installed itself (somewhere obscure, I promise you).  Had to uninstall it, and then reinstall, this time noting down the install location, which for those of you going through the same thing, I will state here was C:\Program Files\Windows Resource Kits\Tools\.

So then I took a deep breath, constructed the command line that I needed in Notepad, then opened a command window, browsed to the obscure install folder shown above, and carefully pasted my unwieldy command into the window, then (holding my breath), I hit return.  A frightening amount of techno babble appeared as the command executed.  After a few tries, I got it to succeed, though it still gave warnings.  I had to do this four different times, once for each file and then for the containing folder.  The model command line is:


subinacl /onlyfile "\\?\c:\path_to_problem_file" /setowner=domain\administrator /grant=domain\administrator=F

View this joyful output here: [screenshot: command window showing the subinacl output]
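For what it's worth, the same take-ownership repair can also be scripted from .NET instead of subinacl.  Here is a minimal sketch, assuming it runs elevated under an administrator account (the stuck file's path comes from the command line, and the account name is a placeholder--adjust to taste):

using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

class TakeOwnership
{
    static void Main(string[] args)
    {
        // Pass the path of one stuck file on the command line.
        FileInfo file = new FileInfo(args[0]);

        // Take ownership, then grant Administrators full control.
        FileSecurity security = file.GetAccessControl();
        NTAccount admins = new NTAccount("BUILTIN", "Administrators");
        security.SetOwner(admins);
        security.AddAccessRule(new FileSystemAccessRule(
            admins, FileSystemRights.FullControl, AccessControlType.Allow));
        file.SetAccessControl(security);

        Console.WriteLine("Ownership reset on " + file.FullName);
    }
}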

Altogether, the failures of Pmwiki and Microsoft file permissions have cost me hours of hassle over the five years I have managed the server.  This latest offense was just the crowning blow.  Managing file uploads on IIS is a challenge at any time, as the web server really is leery of anyone writing into a file zone that it serves.  But this doggy PHP open-source piece of software (Pmwiki), which was tested only on Linux, is in retrospect hardly worth the effort.  I haven't tried installing other wiki software yet (no time for the learning curve!), but I hope there is another one that works better when running on a Windows server.

Microsoft's support page did do the trick.  The problem shouldn't be possible anyway.  The big Administrator account should always be able to delete a file--what were they thinking?--but at least they provided the necessary utility to dig myself out of the pit. Still, I'm not exactly feeling kindly towards Microsoft at this moment.  Or towards Pmwiki.

Thursday, June 28, 2012

ERROR - The 'Microsoft.Jet.OLEDB.4.0' provider is not registered on the local machine.

I support a .NET application which reads a version table in its backend database, which is Microsoft Access.  If the version is wrong, the application refuses to run.  Recently, one of the application users upgraded to a 64-bit machine, and the application began reporting that the backend database was "the wrong version", even though it was in fact the same database that had worked fine on a 32-bit machine.  After some debugging, I unearthed the following error message (which was being swallowed): ERROR - The 'Microsoft.Jet.OLEDB.4.0' provider is not registered on the local machine.

Thanks to Google and other bloggers, I learned that Visual Studio 2010 Professional compiles, by default, with an option called "Any CPU".  But after recompiling the application with the "x86" option (which is an alternative to "Any CPU"), the application began to work on either a 32-bit or 64-bit CPU.  Huh? 

Apparently, with the "Any CPU" option, .NET ran the application as 64-bit (because it happened to reside on a 64-bit machine at the time) and a mismatch occurred when it tried to read from the 32-bit Access database.   By forcing the compiler to be "x86", I forced the program to run as a 32-bit process even though it resides on a 64-bit machine, and no mismatch occurred.  
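To make the mismatch concrete, here is a tiny repro sketch (the .mdb path is a placeholder).  The same code either opens the database or throws the "provider is not registered" error, depending only on the bitness of the process it runs in:

using System.Data.OleDb;

class JetBitnessDemo
{
    static void Main()
    {
        // Jet 4.0 ships only as a 32-bit provider.  Built as "Any CPU"
        // and run on a 64-bit machine, Open() throws "provider is not
        // registered"; built as "x86", it succeeds on the same machine.
        using (OleDbConnection conn = new OleDbConnection(
            "Provider=Microsoft.Jet.OLEDB.4.0;" +
            @"Data Source=C:\data\backend.mdb"))
        {
            conn.Open();
        }
    }
}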

I did notice that the application starts a bit more slowly than it did on a 32-bit machine, but once running, it warms up and runs just fine.

Those compiler options are very poorly named.  And now when Office upgrades, someday, to 64-bits, I'll probably have to recompile again.  Thanks, Microsoft.


Saturday, December 03, 2011

How to write to the disk (make a writable folder) in ASP.NET

To do file uploads, some administration is always necessary on the server side. ASP.NET tries very hard to prevent users from writing anywhere on the server, so we have to take special steps if file upload is required. In particular, the IIS7 server will not let you both Execute and Write into the same folder. For the record, here are the steps for IIS7 on Windows Server 2008 (assuming you are system administrator on the server):

* the user creates an uploads folder and sends its file spec to the administrator
* the administrator opens the Internet Information Services (IIS) Manager, browses to that folder, and right clicks over it to change file permissions as follows for the NETWORK SERVICE account:
** remove Read and Execute permission
** add Write permission

Both permissions should be changed in one step, and that should be all that is necessary. But test the write and subsequent read carefully; if either does not work, delete the folder, create a new one, and start all over again.
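Once the folder is writable, the upload itself is ordinary ASP.NET.  A minimal code-behind sketch (the control names and the "uploads" folder are my placeholders; it assumes a FileUpload control named FileUpload1 declared in the page markup, with a button wired to this handler):

using System;
using System.IO;

public partial class UploadPage : System.Web.UI.Page
{
    protected void UploadButton_Click(object sender, EventArgs e)
    {
        if (FileUpload1.HasFile)
        {
            // Strip any client-supplied path, then save into the
            // folder whose permissions were configured above.
            string name = Path.GetFileName(FileUpload1.FileName);
            FileUpload1.SaveAs(Server.MapPath("~/uploads/" + name));
        }
    }
}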

If you are using a hosting service, there should be some special procedure to get the permissions changed. In my experience, changing the permissions for this purpose has spotty success and can lead to a drawn-out hassle. I suppose it's worth it to have a reasonably secure server.


Tuesday, November 01, 2011

VS.NET 2010 fails to compile program created earlier in VS.NET 2008

Trying to recompile a program in Visual Studio 2010, which was originally created using Visual Studio 2008 (which is still on my PC), I got this baffling message:

Error 9 Task failed because "sgen.exe" was not found, or the correct Microsoft Windows SDK is not installed. The task is looking for "sgen.exe" in the "bin" subdirectory beneath the location specified in the InstallationFolder value of the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v6.0A. You may be able to solve the problem by doing one of the following: 1) Install the Microsoft Windows SDK for Windows Server 2008 and .NET Framework 3.5. 2) Install Visual Studio 2008. 3) Manually set the above registry key to the correct location. 4) Pass the correct location into the "ToolPath" parameter of the task.

These suggestions were too bizarre even to consider, and so I Googled. Right away, I found this nice blog entry which helped me out, and just in case it were to go away, I'm duplicating the helpful information it contains below. So, courtesy of the "dukelupus" blog, everything after this paragraph is copied verbatim from that blog.

Changing the registry key will not help nor will adding C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\ to the path. I did not try other solutions...I used FileMon to check what Visual Studio is looking for – and it appears that it will always look for that file at C:\WINDOWS\Microsoft.NET\Framework\v3.5\, which does not contain sgen.exe.

Just copy sgen.exe from C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\ to C:\WINDOWS\Microsoft.NET\Framework\v3.5\ and everything will now compile just fine. Here, to make your life easier, copy command:

copy /y "C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\sgen.exe" "C:\WINDOWS\Microsoft.NET\Framework\v3.5\"

Good luck!


Wednesday, August 24, 2011

SQL Server Management Studio and the "Saving changes is not permitted" error.

Sometimes after a new install of Microsoft SQL Server Management Studio, you may get a default setting that prevents you from changing the design of tables in the visual designer mode.  This can be extremely frustrating, as it is not easy to figure out how to turn off the check that is preventing table redesign.  The error message is announced in an annoying popup whose text cannot be copied:

"Saving changes is not permitted. The changes you have made require the folloing tables to be dropped and re-created. You have either made changes to a table that can't be re-created or enabled the option Prevent saving changes that require the table to be re-created."

Here's how to solve this in Microsoft SQL Server Management Studio 2008 through 2014:

1) Go into the Tools...Options... menu

2) In the popup, on the left, expand "Designers" by clicking on the plus

3) In the "Table Options" shown to the right, MAKE SURE that "Prevent saving changes that require table re-creation" is NOT checked.

I've lost a few hours of my life to this nuisance. Hope this will help someone else out of the conundrum.


Friday, June 10, 2011

ODBC workaround (Access to SQL Server) for 64-bit Windows 7

I've been avoiding 64-bit Windows due to various incompatibility rumors, but this case takes the cake, as it is entirely Microsoft's fault. My work place uses a variety of shared Access databases, located on a network drive, that connect via ODBC System DSN's to a SQL Server 2008.

Even though all DSN's appeared to be configured correctly on my colleague's brand new (64-bit) Windows 7 machine, and the ODBC connections passed their test, the actual database declined to connect to the server. Thanks to various discussion groups, we finally figured out that the graphical user interface accessible from the Administrative Tools applet in Control Panel actually brings up a 64-bit ODBC application, whereas we (for backwards compatibility) needed the strangely hidden 32-bit System DSN window. To run it, we had to browse to this path:

C:\Windows\SysWOW64\odbcad32.exe

Clicking on odbcad32.exe runs the 32-bit version of the ODBC connection setter upper. There, we re-created all the System DSN's, and finally the Access databases were happy.

[Screenshot: the 32-bit ODBC Data Source Administrator]

By default, the Windows GUI presents the 64-bit ODBC connection setter upper (which I believe lives somewhere under C:\Windows\system32).  Going manually to the SysWOW64 path above and running odbcad32.exe there, then re-adding the ODBC connections, makes it work.
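The rule of thumb, as far as I can tell: a process sees only the DSNs created by the ODBC administrator of its own bitness.  A tiny C# check (the DSN name is a placeholder) makes the symptom easy to reproduce:

using System;
using System.Data.Odbc;

class DsnBitnessTest
{
    static void Main()
    {
        // Built as x86, this resolves DSNs from the 32-bit (SysWOW64)
        // hive; built as x64, from the 64-bit hive.  The same DSN name
        // can therefore work in one build and fail in the other.
        using (OdbcConnection conn = new OdbcConnection("DSN=MyServerDsn;"))
        {
            conn.Open();
            Console.WriteLine("Connected from a " +
                (IntPtr.Size == 8 ? "64-bit" : "32-bit") + " process.");
        }
    }
}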

In the meantime, 3 days of work were lost to this problem.


Thursday, May 20, 2010

The challenge of printing a .NET textbox

On Yahoo Answers, someone asked how to print the content of a RichTextBox control in .NET.  I did finally manage to print a (simpler) TextBox control in .NET, and that was difficult enough.  I am documenting that here for myself or anyone else facing this challenge.

First, a little rant. Before the .NET framework was released (~2000), Microsoft provided a handy .print method on the RichTextBox and TextBox controls, and all a programmer needed to do was call it. But in Microsoft's almighty wisdom (and trying to be just like Java), the simple and highly useful .print method was removed in .NET, and now you have to do all the following steps successfully. And note, it's just as difficult in Java--what were the language developers thinking? I imagine they were thinking to provide maximum flexibility, but why not provide a quick-and-dirty out for the rest of us?

In the example below, my form (called JobForm) is printing the contents of a TextBox control (called textBoxRight).

STEP 1: DECLARE VARIABLES

You need a bunch of special fields in your form code to keep track of printing information. To get started, just use this:

#region printing declarations

        private Font midlistRegular = new Font(
            "Microsoft Sans Serif",
            (float)7.8,
            FontStyle.Regular,
            GraphicsUnit.Point);

        private Font midlistRegular1 = new Font(
            "Microsoft Sans Serif",
            (float)7.6,
            FontStyle.Bold,
            GraphicsUnit.Point);

        private int printMargin = 1;

        private int lastPosition = 0;

        private int lastIndex = 0;

        /// <summary>
        /// Set in Paint (of form) for use when printing
        /// </summary>
        private float screenResolutionX = 0;

        #endregion printing declarations

STEP 2: INSTANTIATE A PRINTDOCUMENT OBJECT AND CREATE ITS EVENT CODE

You need a special object that raises an event each time one page has been printed and causes the next page to be printed. It is a non-visible control and is called "PrintDocument" (in library System.Drawing.Printing).

In the Windows Designer, drag a non-visible "PrintDocument" control onto your form (System.Drawing.Printing.PrintDocument).  It will be instantiated on your form as "printDocument1".  Double-click the PrintDocument control on the form to create the "PrintPage" event and give it the following code (using your TextBox name instead of "textBoxRight"):

private void printDocument1_PrintPage(object sender, PrintPageEventArgs e)
        {
            try
            {
                e.HasMorePages = this.PrintOnePage(
                    e.Graphics,
                    this.textBoxRight,
                    this.printDocument1,
                    this.screenResolutionX);
            }
            catch (Exception ex)
            {
                MessageBox.Show(ex.Message);
            }
        }

STEP 3: CREATE THE PrintOnePage() METHOD

Now create the PrintOnePage() method needed by the above code. Although the display truncates this code, if you copy it using Control-C, all the code will be grabbed. Use the boilerplate code below unchanged (and I apologize that it's so ugly):

        /// <summary>
        /// Goes through each line of the text box and prints it
        /// </summary>
        private bool PrintOnePage(Graphics g, 
                 TextBox txtSurface, PrintDocument printer, 
                 float screenResolution)
        {
            // Font textFont = txtSurface.Font;
            Font textFont =
                   new Font("Microsoft Sans Serif",
                            (float)10.0, FontStyle.Regular,
                            GraphicsUnit.Point);

            // go line by line and draw each string
            int startIndex = this.lastIndex;
            int index = txtSurface.Text.IndexOf("\n", startIndex);

            int nextPosition = (int)this.lastPosition;
            // just use the default string format
            StringFormat sf = new StringFormat();

            // sf.FormatFlags = StringFormatFlags.NoClip | (~StringFormatFlags.NoWrap );
            // printable page height in pixels (paper height is in
            // hundredths of an inch; leave a one-inch bottom margin)
            int lastPagePosition = (int)(((printer.DefaultPageSettings.PaperSize.Height / 100.0f) - 1.0f) * (float)screenResolution);
            // int resolution = printer.DefaultPageSettings.PrinterResolution.X;

            // use the screen resolution for measuring the page
            int resolution = (int)screenResolution;

            // calculate the maximum line width in pixels from the default
            // paper size (hundredths of an inch) and the margin
            int maxwidth =
                (int)((printer.DefaultPageSettings.PaperSize.Width / 100.0f - this.printMargin * 2) * resolution);

            // convert the margin to pixels (plus a small fudge factor)
            int printMarginInPixels = resolution * this.printMargin + 6;
            Rectangle rtLayout = new Rectangle(0, 0, 0, 0);
            int lineheight = 0;

            while (index != -1)
            {
                string nextLine = txtSurface.Text.Substring(startIndex, index - startIndex);
                lineheight = (int)(g.MeasureString(nextLine, textFont, maxwidth, sf).Height);
                rtLayout = new Rectangle(printMarginInPixels, nextPosition, maxwidth, lineheight);
                g.DrawString(nextLine, textFont, Brushes.Black, rtLayout, sf);

                nextPosition += (int)(lineheight + 3);
                startIndex = index + 1;
                index = txtSurface.Text.IndexOf("\n", startIndex);
                if (nextPosition > lastPagePosition)
                {
                    this.lastPosition = (int)screenResolution;
                    this.lastIndex = index;
                    return true; // reached end of page
                }
            }

            // draw the last line
            string lastLine = txtSurface.Text.Substring(startIndex);
            lineheight = (int)(g.MeasureString(lastLine, textFont, maxwidth, sf).Height);
            rtLayout = new Rectangle(printMarginInPixels, nextPosition, maxwidth, lineheight);
            g.DrawString(lastLine, textFont, Brushes.Black, rtLayout, sf);

            this.lastPosition = (int)screenResolution;
            this.lastIndex = 0;
            return false;
        }

STEP 4: ADD CODE TO YOUR FORM'S PAINT EVENT

In Windows Designer, open your form in graphical view mode. Open the form's Properties Window and click the lightning bolt to see events. Double-click on the form's Paint event to create it, and paste the boiler-plate code from my Paint event below into your form's paint event (your event will have a different name, using your form's name, than mine does below):

private void JobForm_Paint(object sender,
                            System.Windows.Forms.PaintEventArgs e)
        {
            // save the screen resolution (dots per inch) for use when printing
            this.screenResolutionX = e.Graphics.DpiX;

            // start the print position one inch (one resolution unit) down the page
            this.lastPosition = (int)this.screenResolutionX;
        }

STEP 5: ACTUALLY PRINT THE TEXTBOX

To actually print the contents of the TextBox, you'll need code like this in a print menu or button event:

        PrintDialog printDialog1 = new PrintDialog();
        if (printDialog1.ShowDialog() == DialogResult.OK)
        {
            this.printDocument1.PrinterSettings = printDialog1.PrinterSettings;
            this.printDocument1.Print();
        }

I haven't paid huge attention to all the above code since I got it working.  I snarfed much of it from various sources on the web (thank you, bloggers!).  It could be enhanced or cleaned up a lot.  Maybe this will help another programmer get it done.


Thursday, March 04, 2010

Wrestling to sort the ASP.NET GridView

It's supposed to be easy, practically codeless to use, and it is--sometimes: when a GridView is populated with the same dataset every time the page loads.  But I had a drop-down list where a condition had to be selected, and based on that, the grid then had to be populated.  I made it to the point where the dropdown was selecting and the grid was populating, but there were two problems: paging didn't work, and sorting didn't work.  I decided to turn paging off, so sorting was my last remaining issue.

Umpteen useless web articles later, I resorted to the paper books stashed on my shelf at home. First stop was Murach's ASP.NET 2.0, which is alleged to be so good. But it held no love for me. Second stop was Dino Esposito's "Programming Microsoft ASP.NET 2.0: Core Reference"--and finally, I got the help I needed.

I'm blogging about this because a good book deserves real credit.  Many mysteries were unraveled by the Esposito book, including that I needed to explicitly re-populate the GridView when its "Sorting" event fired.  Esposito's directions were extremely explicit: use the "Sorting" event's GridViewSortEventArgs ("e" parameter) to find out the sort key, and write a special stored procedure that uses the sort key to ORDER the data differently.  These last bits of information were the treasure that finally allowed me to get sorting to work.

I'm posting a copy of the rather odd-looking stored procedure that I ended up using below for your edification. The "@order_by" parameter names the column on which to sort, and the odd way of constructing the query from strings allows brackets to fit around any strange or keyword column names:

CREATE PROCEDURE [dbo].[lsp_NRSA_find_counts_and_person_by_naded] 
@naded_id int,
@order_by nvarchar(12)
AS

BEGIN

 IF @order_by = ''
 BEGIN
  SET @order_by = 'slide'
 END
 
 EXEC ('SELECT * ' + 
 'FROM vw_NRSA_cemaats_by_count_and_person ' +
 'WHERE naded = ' + @naded_id + 
 ' ORDER BY [' + @order_by + ']')

END
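For completeness, here is roughly what the page side looks like.  This is a sketch under my own naming assumptions (the connection string, DropDownList1, and GridView1 are placeholders, with the two controls assumed to be declared in the page markup), not my exact production code:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI.WebControls;

public partial class CountsPage : System.Web.UI.Page
{
    // Placeholder connection string; use your own.
    private const string ConnString =
        "Data Source=.;Initial Catalog=Phyco;Integrated Security=True";

    // GridView1's Sorting event: re-query using the clicked column name.
    protected void GridView1_Sorting(object sender, GridViewSortEventArgs e)
    {
        BindGrid(e.SortExpression);
    }

    private void BindGrid(string orderBy)
    {
        using (SqlConnection conn = new SqlConnection(ConnString))
        using (SqlCommand cmd = new SqlCommand(
            "lsp_NRSA_find_counts_and_person_by_naded", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@naded_id",
                int.Parse(DropDownList1.SelectedValue));
            cmd.Parameters.AddWithValue("@order_by", orderBy ?? "");

            DataTable table = new DataTable();
            new SqlDataAdapter(cmd).Fill(table);
            GridView1.DataSource = table;
            GridView1.DataBind();
        }
    }
}

One caution: since the stored procedure splices @order_by into dynamic SQL, it's wise to validate the incoming sort expression against the real column names, brackets or no brackets.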


Tuesday, November 10, 2009

Sql Server Management Studio 2008 logon persistence problem

I very often open SQL Management Studio 2008 for the same server.  The problem is, the very first time I accessed this server, I did so using a different user name and password.  Forever after, SQL Management Studio refused to remove this old server entry from my dropdown list in the "Connect to Server" popup window.

This made it necessary to click to change the user and enter a new password about 80 times per day--all because this premium, allegedly sophisticated software has decided never to forget a former entry--even though I have deleted the Registered Server, recreated it, and so forth.  I'm frankly angry about this nuisance, which wouldn't matter if I didn't have to open and close the tool so many times per day.

If anyone has any concrete suggestion on how I can defeat this thing, please let me know. I see from the internet that I am not the only one having this frustration, but I have yet to find any blog entry or suggestion to alleviate the problem.

UPDATED with an answer in Aug. 2010:

For Sql Server Management Studio 2008 on Windows XP, to restart all your logins, delete the file:

C:\Documents and Settings\%username%\AppData\Roaming\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin

or possibly:

C:\Documents and Settings\[user]\Application Data\Microsoft\Microsoft SQL Server\100\Tools\Shell

----

For Sql Server Management Studio 2008 on Windows 7, to restart all your logins, delete the file:

C:\Users\%username%\AppData\Roaming\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin


Monday, September 28, 2009

Delete a maintenance plan after changing SQL Server 2008 name


I've seen a few posts on how to delete an older maintenance plan after you've changed the computer name for a default installation of SQL Server 2008, but none of the posts fully solved the problem.  Below is what I had to do--with the caveat that you have to find your own id's (the first query below should help you look them up):

USE msdb
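
-- (To find your own id's, a lookup along these lines should list each
-- plan with its subplan and job ids; substitute them into the DELETEs below.)
SELECT p.id, p.name, sp.subplan_id, sp.job_id
FROM dbo.sysmaintplan_plans AS p
JOIN dbo.sysmaintplan_subplans AS sp
    ON sp.plan_id = p.id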

DELETE 
FROM dbo.sysmaintplan_log 
WHERE subplan_id = '36F8247F-6A1E-427A-AB7D-2F6D972E32C1'

DELETE 
FROM dbo.sysmaintplan_subplans 
WHERE subplan_id = '36F8247F-6A1E-427A-AB7D-2F6D972E32C1'

DELETE 
FROM dbo.sysjobs 
WHERE job_id = '3757937A-02DB-47A6-90DA-A64AE84D6E98'

DELETE 
FROM dbo.sysmaintplan_plans 
WHERE id = 'C7C6EFAA-DA4D-4097-9F9F-FC3A7C0AF2DB'


Sunday, May 03, 2009

C# code to get schema of an Access table

I just wanted to dump the schema of a table in Microsoft Access. There is a lot of code on the web which purports to do this, but most of it didn't actually work, and most of it was not in C# (my current preferred language). So I am posting this here for anyone that needs it. Just change the path to the database file in the first line of code, and the name of the table ("taxa" in my example) in the second line of code. This program dumps the table schema to a file created by the LogFile object (also attached below) located in the application folder. You'll need to add "using System.Data.OleDb;" to the top of the file. Or, just download the code here. The code:




OleDbConnection conn =
 new OleDbConnection(
    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" +
    "C:\\phycoaide\\phycoaide.mdb;Persist Security Info=False;");

// retrieving schema for a single table
OleDbCommand cmd = new OleDbCommand("taxa", conn);
cmd.CommandType = CommandType.TableDirect;
conn.Open();
OleDbDataReader reader =
 cmd.ExecuteReader(CommandBehavior.SchemaOnly);
DataTable schemaTable = reader.GetSchemaTable();
reader.Close();
conn.Close();

LogFile.WriteLine(" ");
foreach (DataRow r in schemaTable.Rows)
{
 LogFile.WriteLine(" ");
 foreach (DataColumn c in schemaTable.Columns)
 {
    LogFile.WriteLine(c.ColumnName + ": " + r[c.ColumnName]);
 }
}
MessageBox.Show("done");


The LogFile class creates a file in the application folder from which your program runs:




// 
// No copyright; free for reuse by anyone
// 
// Pat G. Palmer
// ppalmer AT harbormist.com
// 2009-05-04
// Opens a single-threaded log file to trace execution
namespace Ansp
{
    using System;
    using System.IO;            // file readers and writers
    using System.Windows.Forms; // Application object

    /// <summary>
    /// Singleton that appends log entries to a text file
    /// in the application folder.  If the file grows
    /// to be too large, it deletes itself and starts over.
    /// The file is kept open until the application ends
    /// and implements the "dispose" pattern in case things
    /// do not end gracefully.
    /// </summary>
    public class LogFile : IDisposable
    {
        private static int maxsize = 470000;
        private static string fileSuffix = "_log.txt";
        private static string fileSpecification;
        private static StreamWriter filewriter;
        private static LogFile instance;

        private LogFile()
        {
        }

        ~LogFile()
        {
            this.Dispose(false);
        }

        public static void InitLogFile()
        {
            if (instance == null)
            {
                instance = new LogFile();
            }

            string stringMe = "InitLogFile: ";
            try
            {
                if (Application.ProductName.Length == 0)
                {
                    fileSpecification = Application.StartupPath + "\\" +
                       "Test" + fileSuffix;
                }
                else
                {
                    fileSpecification = Application.StartupPath + "\\" +
                       Application.ProductName + fileSuffix;
                }

                // restart file if too big
                if (File.Exists(fileSpecification))
                {
                    FileInfo myFileInfo = new FileInfo(fileSpecification);
                    if (myFileInfo.Length > maxsize)
                    {
                        File.Delete(fileSpecification);
                    }

                    myFileInfo = null;
                }

                // restart file with appending
                filewriter = new StreamWriter(
                   fileSpecification, true, System.Text.Encoding.UTF8);

                // start log with standard info
                WriteLine("\r\n---------------------------------------------");
                string tempString = stringMe +
                    Application.ProductName + " " +
                    Application.ProductVersion +
                    "log opened at " + 
                    DateTime.Now;
                WriteLine(tempString);
                WriteLine(stringMe + "username=" + SystemInformation.UserName);
                WriteLine(stringMe + Application.StartupPath);
            }
            catch
            {
            }
        }

        public static void WriteLine(string myInputLine)
        {
            try
            {
                if (instance == null)
                {
                   InitLogFile(); // first time only
                }
                if (myInputLine.Length != 0)
                {
                   filewriter.WriteLine(myInputLine);
                   filewriter.Flush(); // update file
                }
            }
            catch
            {
            }
        }

        public static void Close()
        {
            instance.Dispose();
        }

        /// <summary>
        /// Implement IDisposable.
        /// Do not make this method virtual.
        /// A derived class must not override this method.
        /// </summary>
        public void Dispose()
        {
            this.Dispose(true);
            //// Now, we call GC.SupressFinalize to take this object
            //// off the finalization queue and prevent finalization
            //// code for this object from executing a second time.
            GC.SuppressFinalize(this);
        }

        private void Dispose(bool disposing)
        {
            if (disposing)
            {
                // no managed resources to clean up
            }
            if (instance != null)
            {
                if (filewriter != null)
                {
                    try
                    {
                        filewriter.Flush();
                        filewriter.Close();
                    }
                    catch
                    {
                    }

                    filewriter = null;
                } // end if filewriter not null
            } // end if instance not null
        }

    } // end class LogFile()
} // end namespace


Thursday, December 27, 2007

Suppressing VS.NET Compiler Warnings (2005, 2008)

"1591" was the magic string that I needed, and this is the sorry tale of how to find that out.

As a rule, I don't like suppressing compiler warnings, but there is a time for everything. My time came when I inherited a huge mass of ill-behaving C# code and began adding XML comments. I was immediately overwhelmed by hundreds of warnings that said "Missing XML comment for publicly visible type". I wanted to compile the comments that I had added without being nagged by the compiler for not having added the remaining 300 possible XML comments as well.

I knew that Visual Studio 2005 would let me suppress specific warnings in the project build properties.  However, I didn't know the warning number that I needed to supply.  Microsoft, in their great goodness, has suppressed the showing of warning numbers--they only show the text.  A few googles later, I knew that it was either 1591 or CS1591, but no one anywhere told me how to find the full list of warning numbers.  I've wanted this list many a time in the past, so I set out to find out, once and for all.

Eventually, I found that I needed to start at the top-level C# Reference page in MSDN2 (for the appropriate version of VS.NET), then search on "compiler warning " + "warning text". So searching on "compiler warning missing XML comment" got me the precious warning number that I needed, which is CS1591. But then I had to psychically understand, of course, that the CS must be left off, and only the 1591 entered.

See my glorious build screen which finally suppressed the evil hundreds of unwanted warnings:


[Screenshot: the project Build properties page, with 1591 entered in the "Suppress warnings" box]
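Incidentally, if you'd rather silence the warning for just one stretch of code instead of the whole project, C# (2.0 and later) has a pragma for it.  A minimal example:

// Turn off "missing XML comment for publicly visible type or member"
// (CS1591) around a region, then turn it back on.
#pragma warning disable 1591
public class NotYetDocumented
{
    public void AlsoNotDocumented() { }
}
#pragma warning restore 1591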

UPDATE in Oct 2008: Now that I am using Visual Studio 2008, I have learned that I can right-click over a warning in the Error List pane, and it will pop up documentation about the warning that includes its error level and number, and from that, I can derive the 4 digits to place in the suppress box of the project Build properties. It is not necessary to search on the Microsoft website. I don't know if this feature was present in Visual Studio 2005 (and I just didn't know it), or not.
