
Posts

Showing posts from 2016

Is your SQL Server Database backup good enough? Can it save you during a disaster?

I recently met a DBA from a company that has been in business for about two years. During the conversation I learned that they have never tested their backup files - not even once. They have also very rarely used the DBCC CHECKDB command. Should I say I was surprised, or shocked? It has become common to see database administrators set up a fancy database backup plan, automate it, and leave it at that. The idea is that whenever the need arises (read: disaster strikes) they can use those backups to restore the database. Although that sounds like a good plan in theory, it actually isn't. Why? All the effort spent taking regular backups becomes completely useless if those files can't be used to recover the database. One important question missed by many of the companies I have consulted for: "Do we regularly make sure that we are able to restore a database from our backups?" Even if the backup process succeeded, how do we know the file isn't corrupted or doesn't have some issue that prevents it from being restored?
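As a minimal sketch of such a check (the database name, backup path and logical file names below are placeholders - adjust them to your environment), you can first verify the backup file and then restore it to a scratch database and run DBCC CHECKDB against that copy:

-- Quick check that the backup file is readable and complete (does not restore it)
RESTORE VERIFYONLY FROM DISK = N'D:\Backups\TEST_Full.bak';

-- Stronger check: restore a throwaway copy and run an integrity check on it
RESTORE DATABASE [TEST_RestoreCheck]
FROM DISK = N'D:\Backups\TEST_Full.bak'
WITH MOVE N'TEST' TO N'D:\Data\TEST_RestoreCheck.mdf',
     MOVE N'TEST_log' TO N'D:\Data\TEST_RestoreCheck_log.ldf';

DBCC CHECKDB (N'TEST_RestoreCheck') WITH NO_INFOMSGS;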

Positives of Cyclone Vardah - Unity, Humanity, Self-help

Chennai, Cyclone Vardah (Dec 12, 2016) - I have never witnessed anything like this in person in my life. The winds were so powerful that they uprooted close to 15 trees in our colony, and the howling sound of the blowing wind was so scary that it gives me goosebumps even now just thinking of it. It was complete carnage by "Vardah" that day. Our colony had about 50 trees, and we have now lost some 15 of them. A minimum of two trees fell on each street. The colony compound wall collapsed as one tree fell straight onto it. Everywhere we looked there were only leaves, fallen branches, fallen trees and cut cables; not an inch of the actual road was clearly visible. "Bad things do happen in the world, like war, natural disasters, disease. But out of those situations always arise stories of ordinary people doing extraordinary things" - Daryn Kagan. The positive side of it was that at least two people from each street volunteered on their own, and in no time we were a solid team

How I got billed even though I had registered as an AWS Free Tier user

Amazon AWS offers a 12-month free tier to get started with their services. At a high level, I went through this link and created a free tier account to play around with. Roughly a month later I checked the Billing dashboard and was surprised to see that I was being billed $0.81 for some usage under the heading "Elastic Compute Cloud" (EC2). Points mentioned about EC2 in the AWS Free Tier - 12 Months Introductory Period: I was mindful only of the numbers mentioned there. But only after seeing the detailed bill did I understand that a few items are billed on an hourly basis even within the Free Tier. Billing Dashboard: Billing Summary, Billing Details. It looks like NAT Gateway & Elastic IP address usage are charged even in the Free Tier: $0.056 per GB of data processed by NAT Gateways; $0.056 per NAT Gateway hour (I had used it for 6 hrs); $0.005 per Elastic IP address not attached to a running instance, per hour, on a pro rata basis (I had used
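As a rough sanity check on those rates: 6 NAT Gateway hours x $0.056 per hour comes to about $0.34, so the remaining portion of the $0.81 would have come from the GB of data processed by the NAT Gateway and from the unattached Elastic IP hours - the exact figures for those aren't shown in this excerpt.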

From a Windows Machine - Connect (ssh) to Linux Instances running in a Private Amazon VPC

To start with, the following tools need to be downloaded: 1. PuTTY 2. PuTTYGen 3. Pageant. If you are a Windows user trying to connect to (SSH into) an AWS EC2 instance, you need to use PuTTY. During the process of provisioning an EC2 instance you would have created/downloaded a key pair file, which is in .pem format. But PuTTY doesn't support the .pem format; it needs the key pair file to be in the .ppk format. That's where PuTTYGen comes into play: it converts a .pem file into a .ppk file with the click of a button. Connecting to an EC2 instance in a public subnet: 1. Open PuTTYGen 2. Load >> choose the .pem file you want to convert 2.1 [Optional] Provide a key passphrase & confirm it. For simplicity's sake, I skip it for now. 3. Click on "Save private key" 4. Open PuTTY 5. Enter the hostname / IP - for example: ec2-user@35.154.74.77 6. Copy-paste that into the Saved Sessions textbox as well 7.

AWS fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request

While using AWS and trying to copy a file from an S3 bucket to my EC2 instance, I ended up with this error message. Command used: aws s3 cp s3://mybucketname/myfilename.html /var/www/html/ Error: fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request The error goes away if we add the region information to the command. I am using Asia Pacific (Mumbai), so I used ap-south-1 as the region name. Modified command: aws s3 cp s3://mybucketname/myfilename.html /var/www/html/ --region ap-south-1

Is Auto Close Enabled in your SQL Server database?

In SQL Server, one of the Database Properties options is Auto Close. Set this option to True if the intention is to shut the database down cleanly and free its resources once the last user accessing it exits. If we want to keep the database alive even after the last user exits, set Auto Close to False. Having this setting enabled in a production environment will lead to performance issues the majority of the time. Why? Because once the database is closed, all cached items (data cache, procedure cache, execution plans) are flushed out as well, so SQL Server has to start from scratch when the next user connects. How do we know what it is set to? Option 1: Try this query, which returns all databases on that server where Auto Close is set to TRUE. SELECT [name] AS [DatabaseName] FROM sys.databases WHERE is_auto_close_on = 1 --To change the AUTO_CLOSE option to FALSE USE [master] GO ALTER DATABASE [TEST] SET AUTO_CLOSE OFF
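If you only want to check one particular database, a quick alternative is the query below (the database name is just a placeholder):

SELECT DATABASEPROPERTYEX(N'TEST', 'IsAutoClose') AS [IsAutoCloseOn]; -- returns 1 when Auto Close is enabled, 0 when it is off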

Creating & Executing SQL Queries in Visual Studio Code

Visual Studio Code is a modern, lightweight (but powerful) source code editor which runs on your desktop and is available for Windows, Mac, and Linux. A new extension has been announced which supports creating and executing SQL queries from within the Visual Studio Code editor. Setting up the environment: 1. Download and install Visual Studio Code from here - https://code.visualstudio.com/Download 2. Next, install the "mssql" extension on it. Refer to the steps here on GitHub or in the Microsoft documentation. As explained in the above links, at a bare minimum these steps should have been completed: installation of Visual Studio Code and the mssql extension; creating the SQL Server connection profile; changing the language of the file to "SQL". Executing the query: 1. It's simple - just type the query you want and press "Ctrl + Shift + E" to execute it. 2. Results can also be saved as a .csv or .json file. Sample output:
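For instance, a trivial query you could type in the editor and run with Ctrl + Shift + E against whichever connection profile you created (an illustrative example; any simple query will do):

SELECT @@SERVERNAME AS [ServerName], DB_NAME() AS [CurrentDatabase], GETDATE() AS [CurrentTime];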

Enable thumbnail preview in Windows 10

By default "Always show icons, never thumbnails" option would be enabled in Windows 10. Just follow these two steps to change that setting. 1. Go to windows explorer and click on "Options" 2. Uncheck the "Always show icons, never thumbnails" option That's it just click on OK and you should be able to see the thumbnail of images.

To run dex in process, the Gradle daemon needs a larger heap. It currently has approximately 910 MB

Warning / Error: To run dex in process, the Gradle daemon needs a larger heap. It currently has approximately 910 MB. Solution which worked for me: Android Studio 2.1.x has a feature named Dex In Process. To make use of it, we need to update the gradle.properties file and increase the amount of memory allocated to the Gradle daemon VM to a minimum of 2 GB. So I just added the line below to my project's gradle.properties: org.gradle.jvmargs=-Xmx2048m

List all stored procedures which are modified recently in a SQL Server database

In SQL Server Management Studio: navigate to your database > Programmability, press F7 (or View > Object Explorer Details), and in the Object Explorer Details window right-click anywhere on the header row and select the "Date Last Modified" field (in case it isn't selected already). T-SQL script to achieve the same result: Option 1: SELECT name, create_date, modify_date FROM sys.procedures WHERE modify_date > '2016-08-09' ORDER BY modify_date DESC; Option 2: SELECT specific_name, created, last_altered FROM INFORMATION_SCHEMA.ROUTINES WHERE routine_type = N'PROCEDURE' AND last_altered > '2016-08-09' ORDER BY last_altered DESC;
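If you'd rather not hardcode a cut-off date, a small variation on Option 1 is to compute the window on the fly (the 7-day window here is just an example):

SELECT name, create_date, modify_date
FROM sys.procedures
WHERE modify_date > DATEADD(DAY, -7, GETDATE())
ORDER BY modify_date DESC;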

Transform your fear to success

During my days as an MVP (Microsoft Most Valuable Professional), I used to mentor a few juniors from the community. One of them was very low on confidence, and fear of failure was hurting him more than anything. Recently, while cleaning up my mailbox, I found a few of the old emails I had sent him - I cleaned them up a bit and compiled them into this article.