Office Exchange Help Blog

Currency Exchange & Money Transfer software

Kentico RSS control

December 15, 2010 23:06 by author Admin
The code-behind reads the Kentico news RSS feed and hands its XML to the XmlDataSource that feeds the Repeater:

using System;
using System.Xml;
using System.Data;
using System.Web.UI;

public partial class WebUserControlRepeater : System.Web.UI.UserControl
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Load the feed XML into a DataSet, then pass it to the XmlDataSource
        using (XmlTextReader reader = new XmlTextReader("http://www.officeexchange.net/CMSPages/NewsRss.aspx"))
        {
            DataSet ds = new DataSet();
            ds.ReadXml(reader);
            XmlDataSource1.Data = ds.GetXml();
        }
    }
}

Inside the Repeater's item template, each feed item is bound with XPath expressions (note that XPath string positions are 1-based, so the substring call below returns the first 12 characters of the description):

<%# XPath("item/title") %>
<%# XPath("substring(item/description, 0, 13)") %>


Backup & restore database

October 5, 2010 15:19 by author Admin

Automate Your Backup and Restore Tasks

By Philip Horan, 2010/10/05


I was looking for a simple way to schedule a job to perform a nightly backup of the production system and restore it to a dev system. I know there are many ways of achieving my goal, and I am sure my solution will be frowned upon by some (if not all), but it works for me, so I thought I would share my experience with others.

For my scenario the requirement was quite straightforward: back up a single 5GB database, then restore it to an existing dev system (overwriting the existing dev system). This article relates to SQL Server 2005 Standard 64-bit (version 9.00.3353) running on Windows Server 2003 R2 Standard x64 (SP2).

I started as I do with most things: by overcomplicating the solution! Initially I planned on creating a job with 3 steps. The first step executed a .vb script. The script traverses the directory provided for the ‘strSrcPath’ parameter, deleting all files that are older than the value provided for the ‘intMaxDate’ parameter.

My first job included 3 steps and was scheduled to run nightly:

  1. Delete previous backups
  2. Backup Production System
  3. Restore to Development System

Step 1 (Cleanup .vb script)

Here was my first attempt:

Option Explicit
Const strSrcPath = "C:\Restore"   ' subdirectory to clean
Const intMaxDate = 0              ' change days to keep here
Dim objFSO
Set objFSO = CreateObject("Scripting.FileSystemObject")
TraverseFolder strSrcPath, intMaxDate

Sub TraverseFolder(strSrcPath, intMaxDate)
    Dim objCurrentFolder
    Set objCurrentFolder = objFSO.GetFolder(strSrcPath)
    On Error Resume Next
    Dim objFile, objFolder
    For Each objFile In objCurrentFolder.Files
        If DateDiff("d", CDate(objFile.DateLastModified), Now) > intMaxDate Then
            objFSO.DeleteFile objFile
        End If
    Next
    For Each objFolder In objCurrentFolder.subFolders
        TraverseFolder objFolder, intMaxDate
    Next
End Sub

When I ran the job SQL returned error 14277.

Error

The command script does not destroy all the objects that it creates. Revise the command script. (Microsoft SQL Server, Error: 14277)

The error was generated because the objects created by the script were not released after execution. To resolve this, I released the objects once the script finishes with them, by adding the following 2 lines:

Set objFSO=Nothing
Set objCurrentFolder=Nothing

The working .vb script is as follows:

Option Explicit
Const strSrcPath = "C:\Restore"   ' subdirectory to clean
Const intMaxDate = 0              ' change days to keep here
Dim objFSO
Set objFSO = CreateObject("Scripting.FileSystemObject")
TraverseFolder strSrcPath, intMaxDate
' Release objFSO after the top-level call; releasing it inside the
' recursive Sub would destroy it while a parent call is still iterating.
Set objFSO = Nothing

Sub TraverseFolder(strSrcPath, intMaxDate)
    Dim objCurrentFolder
    Set objCurrentFolder = objFSO.GetFolder(strSrcPath)
    On Error Resume Next
    Dim objFile, objFolder
    For Each objFile In objCurrentFolder.Files
        If DateDiff("d", CDate(objFile.DateLastModified), Now) > intMaxDate Then
            objFSO.DeleteFile objFile
        End If
    Next
    For Each objFolder In objCurrentFolder.subFolders
        TraverseFolder objFolder, intMaxDate
    Next
    ' Release the per-call folder object at the end of each invocation
    Set objCurrentFolder = Nothing
End Sub

Step 2 (Backup Production System)

The backup is invoked by a job which is scheduled to run daily at 0500. The backup is performed on the same server that hosts the SQL Database.

BACKUP DATABASE MyDatabase
TO DISK = 'C:\Restore\Dev.bak'

Step 3 (Restore to Development System)

The restore operation again is on the same server that hosts the database and overwrites the existing dev system.

RESTORE DATABASE MyDatabase
FROM DISK = 'C:\Restore\Dev.bak'
WITH REPLACE
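In my case the dev database carries the same name and file layout as production. If it did not (a different database name, or data files on different paths), the restore would also need MOVE clauses. Here is a sketch with hypothetical logical file names and target paths; RESTORE FILELISTONLY reports the actual logical names stored in the backup:

```sql
-- List the logical file names inside the backup (names below are assumed)
RESTORE FILELISTONLY
FROM DISK = 'C:\Restore\Dev.bak'

-- Restore under a different database name and file locations (hypothetical)
RESTORE DATABASE MyDatabaseDev
FROM DISK = 'C:\Restore\Dev.bak'
WITH REPLACE,
     MOVE 'MyDatabase'     TO 'D:\DevData\MyDatabaseDev.mdf',
     MOVE 'MyDatabase_log' TO 'D:\DevData\MyDatabaseDev_log.ldf'
```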

This got me thinking: I only needed the delete .vb script because my backup was appending to the existing file.

I made a simple change to the backup script, adding the WITH INIT option.

BACKUP DATABASE MyDatabase
TO DISK = 'C:\Restore\Dev.bak'
WITH INIT

Here is an explanation of the INIT option, taken from Microsoft TechNet:

If the volume contains a valid media header, performs the following checks:

  • Verifies the media password.
  • If MEDIANAME was specified, verifies that the given media name matches the media header's media name.
  • Verifies that there are no unexpired backup sets already on the media. If there are, terminates the backup.

If these checks pass, overwrites any backup sets on the media, preserving only the media header.

If the volume does not contain a valid media header, generates one using the specified MEDIANAME, MEDIAPASSWORD, and MEDIADESCRIPTION, if any.

I no longer needed step 1 as the INIT option overwrites the existing backup set.

OK, I now have a very straightforward backup and restore job. This got me thinking again: what if somebody is connected to the dev system when I am trying to restore it?

To find a script that would disconnect any active processes against the dev system, I searched online and found exactly what I was looking for (the creator posted anonymously in a blog response, so no credit can be given). Blog URL: http://blog.tech-cats.com/2008/01/kill-all-database-connections-to-sql.html

Having originally started with a job that contained 3 steps and then removed the cleanup .vb script to leave only 2 steps (backup and restore), I reverted to 3 steps with the addition of the T-SQL ‘kill all database connections’ script:

  1. Backup Production System
  2. Kill Connections to Dev System
  3. Restore to Development System

The kill script is as follows:

-- Create the sql to kill the active database connections
declare @execSql varchar(1000), @databaseName varchar(100)
-- Set the database name for which to kill the connections
set @databaseName = 'MyDatabase'
set @execSql = ''
select @execSql = @execSql + 'kill ' + convert(char(10), spid) + ' '
from master.dbo.sysprocesses
where db_name(dbid) = @databaseName
and
DBID <> 0
and
spid <> @@spid
exec (@execSql)
GO

The job ran fine for a few days and then failed at step 2. Reviewing the job history showed SQL message ID 6107:

Message
Executed as user: Domain\User. Only user processes can be killed. [SQLSTATE 42000] (Error 6107). The step failed.

After a little research I discovered that you cannot kill system processes. Any SPID less than 50 is generally a system process. To resolve this, I modified the ‘Kill’ script to include additional filters:

AND status <> 'background' AND status IN ('runnable','sleeping')

The ‘Kill’ script now looks as follows:

-- Create the sql to kill the active database connections
DECLARE @execSql varchar(1000)
DECLARE @databaseName varchar(100)
-- Set the database name for which to kill the connections
SET @databaseName = 'DevSystem'
SET @execSql = ''
SELECT @execSql = @execSql + 'kill ' + CONVERT(CHAR(10), spid) + ' '
FROM master.dbo.sysprocesses
WHERE db_name(dbid) = @databaseName
AND status <> 'background'
AND status IN ('runnable','sleeping')
AND DBID <> 0
AND spid <> @@spid
EXEC (@execSql)
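An alternative worth noting (not part of my job, just a sketch): on SQL Server 2005 and later you can disconnect every user in one statement by cycling the database through single-user mode, instead of generating KILL commands per SPID:

```sql
-- Disconnect all other sessions and roll back their open transactions
ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
-- (run the RESTORE ... WITH REPLACE here, from the same session)
-- Reopen the database to all users afterwards
ALTER DATABASE MyDatabase SET MULTI_USER
GO
```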

To conclude, I have a scheduled job that performs 3 tasks to automate backup and restore of my production system to my dev system:

  • Backs up the production system
  • Kills connections prior to the restore
  • Restores the backup to the dev system

So that concludes my backup and restore task. This article was driven by a recent vacation: prior to my vacation I was asked to restore the production system to the dev system at month end, but I was going to be thousands of miles away when they wanted it done. As a novice DBA I am constantly looking for ways to improve how I manage my growing SQL environment and accommodate end user requests. I hope other novice DBAs like me will find this article of use.



Banned entities

September 21, 2010 21:36 by author Admin
http://www.osfi-bsif.gc.ca/app/DocRepository/1/eng/issues/terrorism/entstld_e.xls
http://www.ustreas.gov/offices/enforcement/ofac/faq/answer.shtml

Step 1. Is the “hit” or “match” against OFAC’s SDN list or targeted countries, or is it “hitting” for some other reason (i.e., “Control List” or “PEP,” “CIA,” “Non-Cooperative Countries and Territories,” “Canadian Consolidated List (OSFI),” “World Bank Debarred Parties,” “Blocked Officials File,” or “government official of a designated country”), or can you not tell what the “hit” is?


recovery

September 15, 2010 22:48 by author Admin
Assuming the recommended FULL recovery mode strategy is followed, set the database log file to be truncated when the checkpoint command is issued. The checkpoint command will be issued as part of the backup script that should be run regularly (this process is described later).

NOTE: ‘MyDatabaseName’ should be replaced with the name of the actual database.

USE MASTER
EXEC sp_dboption @dbname = 'MyDatabaseName', @optname = 'trunc. log on chkpt.', @optvalue = 'true'
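Note that sp_dboption is deprecated from SQL Server 2005 onward, and the ‘trunc. log on chkpt.’ option corresponds to the SIMPLE recovery model. On newer versions the equivalent statement is (a sketch):

```sql
-- SQL Server 2005+ equivalent of sp_dboption 'trunc. log on chkpt.' = 'true'
ALTER DATABASE MyDatabaseName SET RECOVERY SIMPLE
```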


To determine the recovery mode of existing databases:

September 15, 2010 22:45 by author Admin
SELECT name, recovery_model_desc FROM sys.databases


My Analysis Services Interview Questions

August 19, 2010 06:54 by author Admin

Dave Rodabaugh's Analysis Services interview questions

My Analysis Services Interview Questions

My good friend Jon Baker suggested that I start a blog on this site, and as I tend to follow the advice of those smarter than I, here it is.  As noted by the title of my blog, I’m a DW/BI guy with extensive experience using Microsoft SQL Server, Analysis Services, Reporting Services, and soon Integration Services.  I’m in Columbus, Ohio (GO BUCKS!) and like all multidimensional guys, I’m a little crazy.  Not overtly so, of course.  Just off-kilter a little.

One day Jon and I were swapping interview stories.  (He's posted about this very topic at (broken link!) Is this a conspiracy?  Oh yes.) He fired his very best interview question at me and I flunked it in spectacular fashion.  That got me to thinking about how I interview DW/BI candidates, particularly those who claim to know Analysis Services.  We decided that some of these stories were pretty good, so I'll be following this post with my favorite interview questions, and some of the responses I've received.

My last big client was The Home Depot.  For 18 months I traveled between Columbus and Atlanta to help pioneer an Analysis Services implementation in their Enterprise Data Warehouse (“EDW”) group.  In that time we always seemed to be short on talent, so I was asked to interview candidates.  I probably interviewed a dozen candidates in about nine months, most of whom had Analysis Services on their resumes, and most of whom really knew little about it or how to use it.

I admit to being a real hammer when conducting an interview.  I take an adaptive approach, by starting with the high-level questions about architecture, business problems, and design philosophies.  If the candidate fails those questions (and most do) then I begin to drill down into their technical knowledge.  Just because a candidate can’t fill the role of architect doesn’t mean they aren’t productive when working under an architect.  I’m sad to report that most people who fail the high-level questions also seem to know very little about Analysis Services’ product features.  What’s shocking is that some of these people are actually giving clients architectural advice!  Why can’t people admit “I don’t know the answer to that question” when peppered in an interview?  Why try to bluster and bluff your way through it?  Do you think I won’t know what you’re doing?!  (Apparently, the pretender detector for most interviewers isn’t very sensitive as many of these people continue to land Analysis Services gigs.) 

This transcends race, gender, nationality, and educational pedigree.  I’ve interviewed Americans, Russians, a Serb, an Iranian, and quite some number of Indians.  I will admit that I’ve had the most trouble with the country presently serving in a large outsourcing role.  I’m pretty sure there are talking points and that candidates are being coached because I see unswerving devotion to their use.  Supposedly, the interviewer will lap up the answer and be convinced of expertise!  As we’ll see, these talking points are vaguely reminiscent of the Underpants Gnomes in a particular South Park episode.

So this is the first in a multi-part series about interviewing Analysis Services candidates.  I offer these stories because they’re humorous and many of you will nod in acknowledgement.  Some of you who read these may think I'm a real blowhard.  Please be assured that I intend no offense to anyone! I've mentored men and women of varying nationalities, and they have taught me a thing or two as well.  I also write these stories because if you’re considering a career in DW/BI using Analysis Services, these can help you.  All of us who pitch our tent in the Microsoft camp do well when Microsoft does well, and we help them by attaining product expertise and plying our craft in the field.  I’ve worked hard to attain expertise (and hopefully I have). If I can do it, so can you.

The next installment of My Analysis Services Interview Questions:  Cool Business Problems.


Part II of My Analysis Services Interview Questions: Cool Business Problems

I’ve seen a variety of techniques used by staffing companies to combat the beating I’ve given their previous candidates.  Though I’ve never been hit by the bait-and-switch, candidates have tried to look up answers during the interview; they’ve had somebody in the room with them; and the last technique seemed to be the group interview.  In addition to the candidate, the phone interview would include the sales rep for the staffing company, and often another technologist.  Why?  To write down the questions, of course!   I can only guess that the last candidate I rejected complained about how unfair the interview was, and the recruiter wants to be sure they get every advantage possible and they think they’ll get a leg up by writing what I ask.  Fair enough, but it won't help you because I’ve learned some great interviewing techniques, like asking follow-on questions based on your first answer.  I understand that the goal for a staffing company is to put cheeks into paying seats, but that's not my goal when I'm interviewing, or when I'm a candidate.

I use an axiom in the classroom and with client personnel who are learning business intelligence:  if you can’t articulate the answer, you don’t know it.  Some will say, “I know the answer.  It’s in my head but I can’t tell you what it is.”  I repeat:  if you can’t articulate something, you don’t know it.  The only evidence of knowledge is its articulation, so let’s lay out a couple of rules.

  • Rule 1:  If you can’t articulate something, you don’t know it.
  • Rule 2:  If you think you know it but just can’t communicate it, refer to Rule 1.

I always start with high-level business questions.  I believe that technical knowledge is the easiest of all skills to acquire.  Writing, speaking, architecting, and mentoring are far more difficult.  Complex business problems require passionate people with creative solutions.  The high-level questions explore all of these at one time.  I want to know if you’re a thinker and a learner, squeezing as much as possible from your experiences.

I usually lead with the following:

“Tell me about some great business problems you’ve solved with Analysis Services.”

I expect to hear an answer like this:  “I used Analysis Services to provide certification supplier metrics for a retail supply chain.  The database and cube(s) that provided these answers had 45 dimensions, about 10 measures, and 25 calculated members.  My favorite metric was a ratio of dollars ordered to dollars received because the business could ensure they were getting what they paid for. Additionally, they were able to reduce back office labor for receiving because they could avoid detail receiving for suppliers they knew were doing well.”

Some of the answers I’ve actually received are downright comical.

“I used surrogate keys.” That’s very nice, but it’s hardly a business problem.

“I built a star schema.” Stating an obvious architecture is not evidence that you really know what you’re doing.

“I used MDX.” That’s good!  Perhaps you used dimensions and cubes as well, but quoting product features provides no evidence of problem solutions.

But my favorite is a maddening three-part answer vaguely reminiscent of the Underpants Gnomes in a certain South Park episode.  The Underpants Gnomes were “experts” in big business because “underpants are big business!” The Underpants Gnomes had three steps:

  • Step 1:  Collect underpants.
  • Step 2:  ?
  • Step 3:  Profits!

Here’s what I get as the most common answer to the question about business solutions.  “First, I talk to the business and gather requirements.  Then I make a design.  Then I build software.”  See?  It IS the Underpants Gnomes, who would say it like this:

  • Step 1:  Collect business requirements.
  • Step 2:  Make a design.
  • Step 3:  Build software.

Of course, this isn’t a business solution.  It’s an approach to solving problems and it’s so obvious that nobody should have to say it.  It would be like going to a staff meeting and hearing a low-level manager lecture you for 15 minutes on the importance of profit.  (Duh.  And yes, I’ve experienced this with a prior employer.  Note I said "prior.")

It may be obvious, but I hear it repeated so many times. Over and over and OVER again.

In an earlier installment of this blog I noted the existence of talking points.  I invariably get these from candidates stemming from one Asian country.  It’s like there’s a four hour seminar that instructs candidates, “Americans LOVE process.  If you tell them this, they’ll see that you’re truly an expert and will have no choice but to take you!”  I’ve had candidates stick to this party line even when I tell them, “That isn’t a business problem.  What interesting business problems have you solved?”  I’ve actually told more than one candidate to stop repeating the line because I don’t want to hear it again.  (I must be the Dr. Laura of BI interviews.)  I usually provide examples in hopes of spurring the candidate to action because sometimes there are linguistic difficulties.  Some candidates respond well after that; some don’t.  In a couple of cases the interview has ended because the candidate can’t deviate from the party line.

Why can’t they answer the question?  There are many reasons.  After teaching SQL Server to adults for more than five years, I’ve taken a very dim view of the average adult’s learning ambitions and thought intensity.  Most people just don’t pay attention and learn lessons.  Others are terrified of talking, but it’s an interview and you must talk.  I think the majority of them simply don’t know the answer to the question but they believe such an admission is failure.  Well, you surely won’t be recommended for an architect position if you can’t tell me what cool business problems you’ve solved, but you may have a personal epiphany that leads to personal growth.

Mostly, candidates responding in this fashion haven’t really used Analysis Services to any great degree even though their resume says they have.  They’ve had success bluffing their way with interviewers who can’t tell the difference.  Personally, I only put on my resume those things in which I am competent or expert, but I seem to be a rarity.  In fact, here's some advice:  I don't like 10-page resumes.  I don't mind four or five pages because it's my duty to read your resume and see what you've written.  If you've worked for 10 or 15 years, I expect more than two pages.  But if you get much past five pages, you're either playing cut-and-paste patsy, or you're telling me stuff I don't care about, like what you wrote in FORTRAN or how many NICs you replaced.  BTW, if you feel the need to tell me that, this doesn't help you and may hurt you. If this is a BI interview then I want to know what you've done that is pertinent to BI. (And don’t get me started on recruiters who don’t actually READ your resume.  I get called frequently for Oracle/Informatica work even though those words don’t appear on my resume.  Yes, many of the same letters in those two words appear on my resume, but not the actual words themselves.  Would you believe that today I got a hit for a UNIX/Sun/Solaris admin?  Huh?  You can't find any of those words on my resume with just a keyword search!  But I digress...)

The lesson here is to pay attention to your experiences.  Wring every last drop of wisdom from your work.  Be prepared to talk about those experiences and what you’ve learned.  You never know when an interviewer will ask you about them!

NEXT.  Part III of My Analysis Services Interview Questions:  Architectural Philosophy


Part III of My Analysis Services Interview Questions: Architectural Philosophy

In the previous installment of Dave’s Cube, I laid out my first interview question for Analysis Services candidates.  Since I use an adaptive technique, I start with the hard questions and move to technical questions later (if necessary).  My first interview question is always about cool business problems that you’ve solved with Analysis Services.  As noted, that’s rough for most people, but it doesn’t get easier with the second question.  

My second question is always about architectural/design philosophies.  I don’t ask too many closed-ended questions because I want you to talk, so I know what you know.  I usually ask the following question:

“Do you have any particular design philosophies or architectural principles that you will use to guide this client toward best practices?”

Believe it or not, you can say something different than I would without consequence, as long as you can provide your reasoning. (You must also be correct, of course.  Telling me it’s red when I know it’s blue won’t work, even if you have reasons.)  I’m a stickler for best practices because they are the authority upon which you stand when telling a client how best to implement a product they’ve probably never used.   Here are some of the answers I expect to hear:

“I recommend that a client always carry as much detail as possible into the data warehouse and into a cube.  Details may use extra space and CPU cycles, but they ensure the future value of the application because they enable the most calculations to be performed upon that data without rework back into the data warehouse and/or cube.  Details often carry negligible incremental cost on first development.”

“I like to use as many dimensions/attributes as suggested by the data – even if those attributes are not in the business requirements.  The incremental time required to design and populate those extra attributes is usually negligible, but the business will often request those dimensions as filters and slicers for calculations they modify or identify late in the process, or even after a project is closed and customer acceptance is complete.”

“I design first for query performance, and then second for processing performance.  The purpose of an Analysis Services implementation is to serve quantitative answers to end users; query performance is paramount.  Processing performance issues can usually be solved by tuning in the relational layer and by eliminating joins in the cube processing statement.”

These answers tell me two things.  First, you know best practices and you’ve actually used them.  The reasoning behind a philosophy usually only comes from experience.  Second, I know immediately when you are a learner and a thinker because such people answer this question *immediately* and with passion.

I have received the same spate of humorous answers I’ve received for my first interview question:

“I use surrogate keys.” Personally, I prefer skeleton keys since they’re very versatile.

“I love star schemas.” Me too.  And Buckeye football.  And Chevrolet Corvettes.  And my kids.  But I digress… None of these are architectural philosophies.

Of course, the Underpants Gnomes hold sway in the mind of many people answering this question.  Recently I figured out why.  Underpants Gnomes wear cones on their heads.  Dogs hate cats.  Buckeyes and Wolverines are at perpetual enmity. Everybody knows that cones hate cubes, but they don’t have an annual contest to settle the score one November Saturday at noon (unless ABC says to play at 3:30 PM).  The Underpants Gnomes consider infiltration into the minds of aspiring Analysis Services candidates a dutiful service to Gnomekind!  Kill the cubes!

Has a candidate ever answered quickly and passionately with a wrong answer?  Only once.  We were interviewing a candidate from NYC who said he was an Analysis Services architect.  He was articulate, smooth, and eager.  He laid out some very good business problems he’d solved with Analysis Services, and we were feeling good since it’s hard to find experienced, thoughtful Analysis Services resources.  His answer to this question flabbergasted us.  He said, “Use as few dimensions as possible.”  Upon inquiry, he divulged his reasoning.  “Every time I add more dimensions to the cube, cube processing slows down to the point where I can’t process the cubes.”  This man was freely sacrificing the power of multidimensional databasing (dimensions) on the altar of performance because he couldn’t figure out how to tune the technology!

I told him that not only did the three people on our end disagree with him, but we had empirical evidence that he was wrong.  We had cubes with 50 or more dimensions in them, yet we could process them just as quickly as if they had only five dimensions.  Since he failed a high-level question, I immediately drilled into technical knowledge.  I asked him if he knew how to optimize a cube schema.  Oddly, he answered this question by discussing dimension optimization, a question that flunks all but the most knowledgeable of Analysis Services people.  He said that he could make dimension keys and names unique, which is great but doesn’t help you optimize a cube schema.

Incredibly, this promising guy had used Analysis Services for years but had never once bothered to learn about optimizing a cube schema.  I explained to him that when he adds dimensions to a cube, Analysis Services adds a join to the SQL statement it issues against the data provider.  (He said he’d never looked at the SQL statement, a sensible answer since he didn’t know he could do anything to change it.)  By optimizing the cube schema, you reduce or completely eliminate the joins in the cube processing statement.  Naturally, queries against a single table in the data source can have nearly instantaneous response times and marvelously high row payout rates, often choking either the RDBMS or Analysis Server.  You can theoretically add an infinite number of dimensions and not appreciably slow cube processing.  (Remember: I was interviewing for Analysis Services 2000 positions; the answer will vary somewhat for SSAS.)

He knew he was cooked.  You could hear it in his voice.  I think the full weight of his error hit him quickly.  He hadn’t just missed a prominent performance tuning feature in Analysis Services.  No, he’d been giving clients architectural advice based on this glaring error.  I told him that after he hangs up the phone, spending two hours with Books Online to read up on this feature would make a much better Analysis Services expert than he was before the phone call.

It was obvious that he was experienced, smart, and teachable.  What he didn’t know was that we asked the boss if we could bring him on anyway, since we were pretty sure we could fill in his holes with little effort.  Sadly, his hourly rate was so high that the boss rejected him.  If you’re going to ask for a mountain of money you need to know at least as much as (and hopefully more than) everybody else.  Dave Ramsey says, “In our economy, you get paid for what you know.”  I modify that to the following: “In our economy, you get paid more than the next guy depending on how much more you know than the next guy.”  We’re both right.

Study hard when mastering Analysis Services!  It pays!

Part IV of My Analysis Services Interview Questions: Technical Features


In parts two and three of this multi-part series on Analysis Services I discussed interview questions about cool business problems and architectural/design philosophies.  The third level of questioning I use deals with product features.  You’ll always get these questions if you fail one or both of the first two because even if you aren’t an architect, you could be very productive while working under an architect.  Plus, if you’re an ambitious learner, you’ll become an architect by learning from an architect.

Sometimes I even ask the technical questions if you pass the first two questions.  Remember, I’m not just looking for answers.  I’m also watching your thought process.  I want to see passion, and I’ve found the best people can hardly wait to tell you about the great things they’ve done, and when you ask them a question, the answer practically leaps out of them.  So if you answer the first two correctly, but I’m not convinced, you may still get the technical questions.

(Interesting story:  one fabulous lady I worked with was annoyed when we hired a guy who clearly knew what he was doing, but of whom I asked no technical questions.  She said she was bothered by this, and asked me why I’d done it.  I said, “Because he clearly knew what he was doing and I’m satisfied that he knew the technical details as well.” He was passionate and articulate.  But I told her that we’d ask technical questions in the next interviews if it made her comfortable, and we did that for two more interviews. But she changed her mind when she figured out that people who really know how to use Analysis Services also know its features.  She once noted that I didn’t hammer her in her phone interview.  I replied that I could tell she was thoughtful, passionate, and a great learner.  I wasn’t wrong.  To this day that lady is one of the very finest technologists I have ever met, and I would work with her anywhere, anytime.  Though I have left the client for whom she works, we still talk regularly.)

Even my technical questions are stratified.  My favorite technical question is “How do you optimize a dimension?”  (Remember, this question’s context was Analysis Services 2000.)

The correct answer has four parts:

  1. If possible, ensure that member keys are unique across a given level, and
  2. If possible, ensure that member keys are unique across the entire dimension.
  3. If possible, ensure that member names are unique across a given level, and
  4. If possible, ensure that member names are unique across a given dimension.

When this is achieved, you have very compact dimensions.  When answering queries, Analysis Services can avoid assigning context to members by using ancestral chains, and things get very quick and compact.  In my mind, this is the first technique for excellence in Analysis Services, and only two or three people have ever answered it correctly.  Usually the respondent, eager to get up after being knocked down by the two prior questions, blurts out, “I go to Tools and select Optimize Schema.”  They’re usually shocked when I inform them that this isn’t the correct answer because Optimize Schema optimizes a cube, and I asked about optimizing a dimension.  Honestly, I’ve never had a candidate answer the question correctly if this is their first answer.

My second favorite question is about optimizing the cube schema.  Nearly all candidates know what this feature is and have used it.  I avoid closed-ended questions like “Have you ever optimized a cube schema?” because the answer is a simple, “Yes.”  This doesn’t tell me what you know, so I always ask what optimizing a cube schema does.  Ironically, the same people who use schema optimization to answer the dimension optimization question don’t know what schema optimization actually does, or why it’s used.  The answer is that it reduces or eliminates joins in the cube processing SQL statement.  It may be the most powerful technique to improve cube processing performance (assuming ample hardware and parallel partition processing that doesn’t swamp the server, and moderate aggregations).

One candidate, who took a beating on the two high level questions, also took a beating on these two technical questions.  He was obviously unaccustomed to this because after a half hour he said, “Well, maybe you just haven’t actually done this.”  This new tack caught me a little off guard since I had explained to him he couldn’t articulate any cool business problems, he didn’t have any strong architectural philosophies with which I agreed (and I told him why), and he didn’t know how to optimize a dimension.  I even told him *why* these things were important and what the right answers were.  In addition to getting the answers wrong, he demonstrated that he wasn’t teachable, a bad trait for any technologist since we must learn continuously.  He was notable not only because of his belligerence, but also because he only had one name on his resume.  He’d concatenated his first and last name into a single word, which had eight syllables and was in a foreign tongue.  He left quite an impression, and six months later when his resume landed on our desks from a different staffing company, we immediately put it into the circular file.  Ignorance, belligerence, and a goofy name are a bad combination.

 


Part V of My Analysis Services Interview Questions: The Most Common MDX Functions

I have a litmus test for true multidimensional expertise:  MDX.  I’ve found MDX to be so potent that I freely admit my own inadequacies.  I study it, I use it, and I consult with others who do the same, but I know there’s more.   If somebody says they know Analysis Services, I want to see their MDX.  Show me somebody who doesn’t know MDX, and I’ll show you a pretender.

Remember, I don’t ask closed-ended questions.  I’ve heard interviewers say, “Do you know MDX?”  Of course, the answer is always “Yes” but this answer tells you nothing of what the candidate actually knows.  I have two favorite MDX questions.

Question 1:  What MDX functions do you most commonly use?

This is a great question because you only know this answer by experience.  If you ask me this question, the answer practically rushes out of me.  "CrossJoin, Descendants, and NonEmpty, in addition to Sum, Count, and Aggregate.  My personal favorite is CrossJoin because it allows me to identify non-contiguous slices of the cube and aggregate even though those cube cells don't roll up to a natural ancestor."  Indeed, CrossJoin has easily been my bread and butter.
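For readers newer to MDX, CrossJoin is essentially the Cartesian product of two sets of members. Purely as an analogy (this is not MDX, and the member names are invented), the pairing it produces can be sketched in Python:

```python
from itertools import product

# Invented member lists standing in for two dimension levels
regions = ["East", "West"]
years = ["2009", "2010"]

# MDX's CrossJoin(regions, years) pairs every region with every year;
# those tuples are the cube cells a query can then aggregate, even when
# they don't roll up to a common ancestor.
cross_join = list(product(regions, years))
print(cross_join)
```

Each tuple addresses one cell, which is why CrossJoin lets you aggregate slices that share no natural ancestor.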

If you stammer and stutter, you’re in trouble.  If the first word out of your mouth is “sum” without any explanation of how you use it, you’re in trouble. 

Question 2:  Where do you put calculated members?

The reflexive answer is “in the Measures dimension” but this is the obvious answer.  So I always follow up with another question.  “If you want to create a calculated member that intersects all measures, where do you put it?”  A high percentage of candidates can’t answer this question, and the answer is “In a dimension other than Measures.”  If they can answer it, I immediately ask them why.  The answer is “Because a member in a dimension cannot intersect its own relatives in that dimension.”

Then I ask a real toughy.  “Where would you put a calculated member if you wanted it to intersect all other dimensions?” You can hear the wheels turning with this question.  You can’t put it in the Measures dimension because it wouldn’t intersect the other measures.  You can’t put it in, say, the Time dimension because it wouldn’t intersect the other Time members.

The answer is simple but not obvious unless you’re very creative or somebody told you the answer. (Somebody told me the answer when I first started learning Analysis Services.)  You create a dummy dimension solely for the purpose of holding those calculated members!  My friends and I refer to this dimension as a “hanger dimension” because the only function it serves is to “hang” calculated members.  Any member placed therein will intersect all other dimensions.

In fact, you can use multiple hangers.  This is particularly handy if you have calculated members that need to intersect all other user-defined dimensions, Measures, *and* another hanger dimension.
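A minimal sketch of what such a member might look like, using the Analysis Services 2000 CREATE MEMBER form (the cube, dimension, and member names here are all invented for illustration, not taken from a real project):

```mdx
// [Hanger] is a dummy dimension with a single real member, created only to
// "hang" this calculated member.  Because the member lives outside both
// Measures and Time, it intersects every member of both dimensions.
CREATE MEMBER [Sales].[Hanger].[Year over Year Growth] AS
  '(Measures.CurrentMember, Time.CurrentMember) /
   (Measures.CurrentMember, Time.CurrentMember.PrevMember) - 1'
```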

In an earlier post I mentioned a fabulous lady who has taught me many things.  She wasn’t able to answer this particular question.  When I told her the answer, she immediately understood it.  It didn’t affect my recommendation to the boss because she was so good.  You’ll never believe what she told me when she arrived for her first day of work.  She said, “Do you remember the question you asked me about the hanger dimension?  After I hung up the phone, I pulled up Analysis Manager and tried it.  It worked really well.”  Wow!  No wonder we like her so much!  She even learns during interviews!

 



 
Comments (24)
24 Thursday, 05 August 2010 16:21
Nathan Proper
It's funny how people who don't know how to answer the questions assume the questioner is an arrogant racist. Did you not notice that he repeatedly said that candidates who couldn't answer the first couple questions still had potential as engineers, just not architects? There is a difference. Just because you can drive a car and even change the oil doesn't mean you know how to build one. So if your resume claims you're an architect, there's nothing wrong with making you prove it by answering difficult questions. Again - as he pointed out time and again - it's less about the "right" answer and more about the thought process behind it. If you can't handle thinking during an interview, McDonald's is probably hiring...
23 Friday, 30 July 2010 05:55
Goocha
First time I have come across an article written by a computer.

Computer can be fooled, he is a fool, racist.
22 Tuesday, 20 July 2010 07:12
Superman
Thank you, I have recently joined a new group in a government body and they want to do "BI". I have been working with the SQL Server stack for over 10 years and still learning. He is bringing me candidates with the skill set "Sql expericence = medium" so I sent him your top level question to use to screen them in.
21 Thursday, 08 July 2010 17:49
tempo
45 dimensions per cube? Obviously this guy needs to refine his design philosophy before interviewing any more potential candidates. I know this kind of "know-it-all" who sits on interview boards and scares already stressed-out candidates.

Good luck to those who happen to have the "massive" bad luck of working with him.
20 Saturday, 22 May 2010 05:24
Sally
The author seems to know the subject, which is fair enough, BUT it's his attitude that would bother anybody wanting to hire this guy or the consultancy he works for. I think he has some unresolved personal issues which he seems to take out on his potential employees. I have quite a few years on SSAS but would never hire someone with his attitude or work for someone like him. I see a potential disaster on the client side with this guy unless he changes his attitude.
19 Friday, 21 May 2010 18:05
uberdbdude
Hmm! Ok, some of your interview questions are alright. But, from your particular manner of presenting your philosophy of interviewing I really doubt if you'd pass muster for one of my interviews. In fact, since you seem to be a consultant I have put you on my 'skip list' should my company be looking.
18 Thursday, 20 May 2010 21:32
anon
Yes it seems like this guy will take someone who is maybe 95% there and lump them in the "doesn't know anything" category. Here's my prediction -- he's interviewed 100's of people but has never once found the right person. They may have had the right answer, but because they could not explain it exactly the way he wanted, then "they didn't know anything."

Lastly, the Microsoft BI stack consists of more than SSAS. So unless you are on a big team where you are looking for someone who will do nothing other than SSAS development (e.g. they are not going to do the SQL work or SSIS work...) then this will potentially remove too many candidates from the selection.

Quite frankly, he could easily be fooled. It looks like you just memorize the answers to questions on the cert test and you are fine with this interview. Of course this does not necessarily net a person that can actually think and produce a business solution.


ajax resizable panel C#

clock July 31, 2010 01:31 by author Admin

First header

First content

Second header

Second content

qqqqqqqqqqq.txt (1.24 kb)
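The demo's markup didn't survive the migration; only the headers and content placeholders remain. A minimal sketch of what a resizable AJAX panel typically looks like with the AJAX Control Toolkit's ResizableControlExtender (the control IDs, CSS class names, and size limits below are assumptions, not the original demo):

```aspx
<asp:ScriptManager ID="ScriptManager1" runat="server" />

<asp:Panel ID="Panel1" runat="server" CssClass="resizablePanel">
    First header
    First content
</asp:Panel>

<%-- Attaches resize behavior to Panel1; the handle and resizing styles
     come from CSS classes you define yourself. --%>
<ajaxToolkit:ResizableControlExtender ID="ResizableControlExtender1" runat="server"
    TargetControlID="Panel1"
    HandleCssClass="resizeHandle"
    ResizableCssClass="resizing"
    MinimumWidth="100" MinimumHeight="50"
    MaximumWidth="600" MaximumHeight="400" />
```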





add new partition SSAS

clock July 14, 2010 21:02 by author Admin
Most partitions for a single Measure Group should be based on the same view and broken out by year.
 
1) In order to add the first yearly partition you need to delete the default partition created when you initially created the measure group.
 
2) Click 'New Partition', in the Partition Wizard - 'Specify Source Information' screen select the available table and click next.
 
3) In the Partition Wizard - 'Restrict Rows' screen (below) select 'Specify a query to restrict rows' and then enter a query of the form SELECT * FROM YourFactView WHERE monthYear = YYYY. Click Next.
 
PartitionWizard_RestrictRows.jpg
 
4) In the Partition Wizard - 'Processing and Storage Locations' screen go with the defaults and click Next.
5) In the Partition Wizard - 'Completing the Wizard' screen assign the correct name (using an XXXXXXX_YYYY format), select the 'Design aggregations for the partition now' option and click Next.
6) Click Next on the 'Welcome to the Aggregation Design Wizard' screen.
7) Click Next on the Aggregation Design Wizard - 'Specify Storage and Caching Options' screen.
8) Click the Count button in the Aggregation Design Wizard - 'Specify Object Counts' screen. After the count is completed click the Next button.
9) In the Aggregation Design Wizard - 'Set Aggregation Options' screen select the 'Performance gain reaches' option and leave the setting at 30%. Click the Start button. Watch the graph grow; either stop when the graph flattens or let the process run until it reaches 30%. These are only the initial aggregations; once the measure group has run for a while they will be redone. Click the Next button when finished building the aggregations.
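Since every yearly partition in steps 3 and 5 follows the same pattern, the restricting query and partition name can be generated mechanically. A small sketch using the article's placeholder view name and naming format (none of these identifiers come from a real schema):

```python
def partition_query(view: str, year: int) -> str:
    # The 'Specify a query to restrict rows' text for one yearly partition (step 3)
    return f"SELECT * FROM {view} WHERE monthYear = {year}"

def partition_name(measure_group: str, year: int) -> str:
    # The XXXXXXX_YYYY naming format from step 5
    return f"{measure_group}_{year}"

# One restricting query per yearly partition of a hypothetical 'Sales' measure group
partitions = {
    partition_name("Sales", year): partition_query("YourFactView", year)
    for year in range(2008, 2011)
}
```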

Last modified at 17/05/2010 3:13 PM by Anthony McNamee


SQL 2005 Disaster Recovery Scripts: To Setup and Remove Mirroring

clock June 7, 2010 19:24 by author Admin
SQL Server 2005 includes a very easy-to-use method to set up and remove mirroring through SQL Server Management Studio. However, when you are dealing with 30+ databases spread across multiple instances, it becomes tedious to use the GUI. I have included two scripts that may assist you in performing the task with T-SQL:

1. Setup mirroring: This script generates nine (9) distinct steps using information you provide for a single database. These steps can then be saved as scripts to quickly set up mirroring for each database. I have tried to include detailed comments on what each step does and a section on what information you need to have available to generate the final script (see the script for further details). Note: I have not included a piece for the witness.

2. Remove mirroring: This script generates a manual failover script for all databases per server \ instance that are mirrored. Note: it does not actually perform the failover; it just generates the script that you can then use to do so.

Hope this helps you in your day-to-day job function or during a disaster recovery exercise. I would love to hear suggestions for improving these. Thanks,

    /********************************************************************************************
    *    Purpose:        Generate steps to setup mirroring in SQL 2005                            *
    *    Author:            Farhan F. Sabzaali                                                        *
    *    Date Created:    March 17, 2009                                                            *    
    *    Last Updated:                                                                            *    
    *    Updates Made:                                                                            *
    *    Instructions:    1. Modify section marked "CHANGE ME"                                     *
    *                    2. Run Query - Change Result to Text as output                            *
    *                    3. Execute one step at a time                                            *
    *********************************************************************************************/
    
    DECLARE     @DatabaseName            NVARCHAR (250)
            ,@Principal                NVARCHAR (1000)
            ,@Mirror                NVARCHAR (1000)
            ,@DomainAccount            NVARCHAR (500)
            ,@DefaultLanguage        NVARCHAR (250)
            ,@ListenerPort            NVARCHAR (50)
            ,@PrincipalBackupFolder    NVARCHAR (1000)
            ,@DatabaseDataFileName    NVARCHAR (250)
            ,@MirrorDataPath        NVARCHAR (1000)
            ,@MirrorLogPath            NVARCHAR (1000)
            ,@PrincipalTCP            NVARCHAR (1000)
            ,@MirrorTCP                NVARCHAR (1000)        
            ,@MirrorSafety            NVARCHAR (50)
            ,@MirrorMonitorJobName    NVARCHAR (250)
            ,@DBDeleteDate            NVARCHAR (100)
            ,@Step1                    NVARCHAR (MAX)
            ,@Step2                    NVARCHAR (MAX)
            ,@Step3                    NVARCHAR (MAX)
            ,@Step4                    NVARCHAR (MAX)
            ,@Step5                    NVARCHAR (MAX)
            ,@Step6                    NVARCHAR (MAX)
            ,@Step7                    NVARCHAR (MAX)
            ,@Step8                    NVARCHAR (MAX)
            ,@Step9                    NVARCHAR (MAX)        
            ,@AllSteps                NVARCHAR (MAX)
            ,@NewLine                CHAR (1)
            ,@APOS                    CHAR (1)

    /* CHANGE ME START */
    SET        @DatabaseName = 'AdventureWorks'                                                --Database Name
    SET        @DatabaseDataFileName = 'AdventureWorks_Data'                                --Logical Name of Data File
    SET        @Principal = 'SERVERA\INSTANCE1'                                    --Principal Server \ Instance Name
    SET        @Mirror = 'SERVERB\INSTANCE1'                                        --Mirror Server \ Instance Name
    SET        @DomainAccount = 'Domain\SQLAGENT'                                    --Domain Account used to setup mirror - should have access on both
    SET        @DefaultLanguage = '[us_english]'                                        --Default Language 
    SET        @ListenerPort = '5512'                                                    --Default TCP \ Mirror Listening Port
    SET        @PrincipalBackupFolder = '\\ServerA\Backup_Instance1\'                --Shared Drive on Principal that can be accessed by Mirror
    SET        @MirrorDataPath = 'M:\Microsoft SQL Server\MSSQL.1\MSSQL\DATA\'            --Folder on Mirror where Data files are saved
    SET        @MirrorLogPath = 'N:\Microsoft SQL Server\MSSQL.1\MSSQL\LOGS\'            --Folder on Mirror where Log files are saved
    SET        @PrincipalTCP = 'TCP://ServerA.corporate.domain.com:5512'        --Principal TCP Configuration - Refer to BOL for details
    SET        @MirrorTCP = 'TCP://ServerB.corporate.domain.com:5512'            --Mirror TCP Configuration - Refer to BOL for details
    SET        @MirrorSafety = 'FULL'                                                    --Mirror Safety - FULL OR OFF                        
    /* CHANGE ME FINISH */    
    
    SET        @MirrorMonitorJobName = 'Database Mirroring Monitor Job'
    SET        @NewLine = CHAR(13)
    SET        @APOS = CHAR (39)
    SET        @DBDeleteDate = CONVERT(NVARCHAR (100), GETDATE() - 1, 101)
    SET        @Step1 = ''
    SET        @Step2 = ''
    SET        @Step3 = ''
    SET        @Step4 = ''
    SET        @Step5 = ''
    SET        @Step6 = ''
    SET        @Step7 = ''
    SET        @Step8 = ''
    SET        @Step9 = ''
    SET        @AllSteps = ''
    
    SET        @Step1 = @Step1 + '--SQL Server Mirror Setup Script - Step 1: Add Login For EndPoint'
    SET        @Step1 = @Step1 + @NewLine + '--Run On Principal Server'
    SET        @Step1 = @Step1 + @NewLine + @NewLine + '--Step 1 Start'
    SET        @Step1 = @Step1 + @NewLine + '
        USE [Master];
        GO
        
        IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name=' + @APOS + @DomainAccount + @APOS + ')
        BEGIN
         CREATE LOGIN [' + @DomainAccount + '] FROM WINDOWS WITH DEFAULT_DATABASE=[master],DEFAULT_LANGUAGE=' + @DefaultLanguage + ';
         PRINT ' + @APOS + 'Create Windows login [' + @DomainAccount + '].' + @APOS + '
        END
        GO

        IF NOT EXISTS (SELECT * FROM sys.database_mirroring_endpoints WHERE name=' + @APOS + 'MirrorEndpoint' + @APOS + ')
        BEGIN
         CREATE ENDPOINT MirrorEndpoint AUTHORIZATION [' + @DomainAccount + ']
            STATE=STARTED 
            AS TCP
            (
             LISTENER_PORT=' + @ListenerPort + ',
             LISTENER_IP = ALL
            )
            FOR DATABASE_MIRRORING 
            (
             AUTHENTICATION = WINDOWS NEGOTIATE,
             ENCRYPTION = SUPPORTED,
             ROLE=ALL
            );
         PRINT ' + @APOS + 'Create mirroring endpoint [MirrorEndpoint].' + @APOS + '
        END
        GO'

    SET        @Step1 = @Step1 + @NewLine + @NewLine + '--Step 1 Finish'

    SET        @Step2 = @Step2 + '--SQL Server Mirror Setup Script - Step 2: Check Database Recovery Mode'
    SET        @Step2 = @Step2 + @NewLine + '--Run On Principal Server'
    SET        @Step2 = @Step2 + @NewLine + @NewLine + '--Step 2 Start'
    SET        @Step2 = @Step2 + @NewLine + '
        USE [Master];
        GO
        
        IF NOT EXISTS (SELECT * FROM sys.databases WHERE name = ' + @APOS + @DatabaseName + @APOS + ' AND recovery_model = 1)
        BEGIN
         ALTER DATABASE [' + @DatabaseName + '] SET RECOVERY FULL;
         PRINT ' + @APOS + 'Set full recovery for database [' + @DatabaseName + '].' + @APOS + '
        END
        GO'
        
    SET        @Step2 = @Step2 + @NewLine + @NewLine + '--Step 2 Finish'

    SET        @Step3 = @Step3 + '--SQL Server Mirror Setup Script - Step 3: Backup Database'
    SET        @Step3 = @Step3 + @NewLine + '--Run On Principal Server'
    SET        @Step3 = @Step3 + @NewLine + @NewLine + '--Step 3 Start'
    SET        @Step3 = @Step3 + @NewLine + '
        USE [Master];
        GO

        BACKUP DATABASE [' + @DatabaseName + '] TO DISK = ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.bak' + @APOS + ' WITH DESCRIPTION = ' + @APOS + 'Back up ' + @DatabaseName + ' database Full' + @APOS + ', FORMAT, INIT, NAME = ' + @APOS + @DatabaseName +'_Step3_Principal.bak' + @APOS + ', SKIP, REWIND, NOUNLOAD, STATS = 10
        GO
        BACKUP LOG [' + @DatabaseName + '] TO DISK = ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.trn' + @APOS + ' WITH DESCRIPTION = ' + @APOS + 'Back up ' + @DatabaseName + ' database Log' + @APOS + ', FORMAT, INIT, NAME = ' + @APOS + @DatabaseName +'_Step3_Principal.trn' + @APOS + ', SKIP, REWIND, NOUNLOAD, STATS = 10
        GO'
        
    SET        @Step3 = @Step3 + @NewLine + @NewLine + '--Step 3 Finish'

    SET        @Step4 = @Step4 + '--SQL Server Mirror Setup Script - Step 4: Restore Database'
    SET        @Step4 = @Step4 + @NewLine + '--Run On Mirror Server'
    SET        @Step4 = @Step4 + @NewLine + @NewLine + '--Step 4 Start'
    SET        @Step4 = @Step4 + @NewLine + '
        USE [Master];
        GO

        RESTORE DATABASE [' + @DatabaseName + '] FROM DISK = ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.bak' + @APOS + ' WITH MOVE ' + @APOS + @DatabaseDataFileName + @APOS + ' TO ' + @APOS + @MirrorDataPath + @Databasename + '.mdf' + @APOS + ', MOVE ' + @APOS + @DatabaseName + '_Log' + @APOS + ' TO ' + @APOS + @MirrorLogPath + @DatabaseName + '.ldf' + @APOS + ', NORECOVERY, NOUNLOAD, REPLACE, STATS = 10
        GO
        RESTORE LOG [' + @DatabaseName + '] FROM DISK = ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.trn' + @APOS + ' WITH NORECOVERY, NOUNLOAD, STATS = 10
        GO'
        
    SET        @Step4 = @Step4 + @NewLine + @NewLine + '--Step 4 Finish'

    SET        @Step5 = @Step5 + '--SQL Server Mirror Setup Script - Step 5: Setup Partner Information On Mirror'
    SET        @Step5 = @Step5 + @NewLine + '--Run On Mirror Server'
    SET        @Step5 = @Step5 + @NewLine + @NewLine + '--Step 5 Start'
    SET        @Step5 = @Step5 + @NewLine + '
        USE [Master];
        GO

        ALTER DATABASE ' + @DatabaseName + '
         SET PARTNER = ' + @APOS + @PrincipalTCP + @APOS + '
        GO'
        
    SET        @Step5 = @Step5 + @NewLine + @NewLine + '--Step 5 Finish'

    SET        @Step6 = @Step6 + '--SQL Server Mirror Setup Script - Step 6: Setup Partner Information On Principal'
    SET        @Step6 = @Step6 + @NewLine + '--Run On Principal Server'
    SET        @Step6 = @Step6 + @NewLine + @NewLine + '--Step 6 Start'
    SET        @Step6 = @Step6 + @NewLine + '
        USE [Master];
        GO

        ALTER DATABASE ' + @DatabaseName + '
         SET PARTNER = ' + @APOS + @MirrorTCP + @APOS + '
        GO

        ALTER DATABASE ' + @DatabaseName + ' SET SAFETY ' + @MirrorSafety + ';
        GO

        USE [msdb];
        GO

        IF EXISTS (SELECT * FROM dbo.sysjobs WHERE name LIKE ' + @APOS + @MirrorMonitorJobName + @APOS + ')
        BEGIN
         PRINT ' + @APOS + @MirrorMonitorJobName + ' already exists on server instance [' + @Principal + '].' + @APOS + ';
        END
        GO

        IF NOT EXISTS (SELECT * FROM dbo.sysjobs WHERE name LIKE ' + @APOS + @MirrorMonitorJobName + @APOS + ')
        BEGIN
         EXECUTE sp_dbmmonitoraddmonitoring;
         PRINT ' + @APOS + @MirrorMonitorJobName + ' has been added to server instance [' + @Principal + '].' + @APOS + ';
        END
        GO'    
        
    SET        @Step6 = @Step6 + @NewLine + @NewLine + '--Step 6 Finish'

    SET        @Step7 = @Step7 + '--SQL Server Mirror Setup Script - Step 7: Verify Principal Setup'
    SET        @Step7 = @Step7 + @NewLine + '--Run On Principal Server'
    SET        @Step7 = @Step7 + @NewLine + @NewLine + '--Step 7 Start'
    SET        @Step7 = @Step7 + @NewLine + '
        USE [Master];
        GO

        IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name=' + @APOS + @DomainAccount + @APOS + ')
         PRINT ' + @APOS + 'Windows Login [' + @DomainAccount + '] does not exist.' + @APOS + '
        GO

        IF NOT EXISTS (SELECT * FROM sys.database_mirroring_endpoints WHERE name=N''MirrorEndpoint'' AND type=4)
         PRINT ' + @APOS + 'Mirroring endpoint [MirrorEndpoint] does not exist.' + @APOS + '
        GO

        IF NOT EXISTS (SELECT * FROM sys.database_mirroring A INNER JOIN sys.databases B ON A.database_id = B.database_id WHERE B.name=' + @APOS + @DatabaseName + @APOS + ' AND A.mirroring_state=4 AND A.mirroring_role=1 AND A.mirroring_partner_instance=' + @APOS + @Mirror + @APOS + ')
         PRINT ' + @APOS + 'Principal Database [' + @DatabaseName + '] does not exist or may not be synchronized or in principal role.' + @APOS + '
        GO'
        
    SET        @Step7 = @Step7 + @NewLine + @NewLine + '--Step 7 Finish'

    SET        @Step8 = @Step8 + '--SQL Server Mirror Setup Script - Step 8: Verify Mirror Setup'
    SET        @Step8 = @Step8 + @NewLine + '--Run On Mirror Server'
    SET        @Step8 = @Step8 + @NewLine + @NewLine + '--Step 8 Start'
    SET        @Step8 = @Step8 + @NewLine + '
        USE [Master];
        GO

        IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name=' + @APOS + @DomainAccount + @APOS + ')
         PRINT ' + @APOS + 'Windows Login [' + @DomainAccount + '] does not exist.' + @APOS + '
        GO

        IF NOT EXISTS (SELECT * FROM sys.database_mirroring_endpoints WHERE name=N''MirrorEndpoint'' AND type=4)
         PRINT ' + @APOS + 'Mirroring endpoint [MirrorEndpoint] does not exist.' + @APOS + '
        GO

        IF NOT EXISTS (SELECT * FROM sys.database_mirroring A INNER JOIN sys.databases B ON A.database_id = B.database_id WHERE B.name=' + @APOS + @DatabaseName + @APOS + ' AND A.mirroring_state=4 AND A.mirroring_role=2 AND A.mirroring_partner_instance=' + @APOS + @Principal + @APOS + ')
         PRINT ' + @APOS + 'Principal Database [' + @DatabaseName + '] does not exist or may not be synchronized or in principal role.' + @APOS + '
        GO'
            
    SET        @Step8 = @Step8 + @NewLine + @NewLine + '--Step 8 Finish'

    SET        @Step9 = @Step9 + '--SQL Server Mirror Setup Script - Step 9: Delete Backup Files'
    SET        @Step9 = @Step9 + @NewLine + '--Run On Principal Server'
    SET        @Step9 = @Step9 + @NewLine + @NewLine + '--Step 9 Start'
    SET        @Step9 = @Step9 + @NewLine + '
        USE [Master];
        GO
        
        EXECUTE xp_delete_file
            0,
            ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.bak' + @APOS + ',
            ' + @APOS + 'bak' + @APOS + ',
            ' + @APOS + @DBDeleteDate + @APOS + ',
            0
        GO

        EXECUTE xp_delete_file
            0,
            ' + @APOS + @PrincipalBackupFolder + @DatabaseName + '_Step3_Principal.trn' + @APOS + ',
            ' + @APOS + 'trn' + @APOS + ',
            ' + @APOS + @DBDeleteDate + @APOS + ',
            0
        GO'
        
    SET        @Step9 = @Step9 + @NewLine + @NewLine + '--Step 9 Finish'

    SET        @AllSteps = @Step1 + @NewLine + @NewLine + @Step2 + @NewLine + @NewLine + @Step3 + @NewLine + @NewLine + 
                        @Step4 + @NewLine + @NewLine + @Step5 + @NewLine + @NewLine + @Step6 + @NewLine + @NewLine + 
                        @Step7 + @NewLine + @NewLine + @Step8 + @NewLine + @NewLine + @Step9 + @NewLine + @NewLine

    SET NOCOUNT ON
    
    SELECT    @AllSteps
    
    SET NOCOUNT OFF

/* START OF SECOND SCRIPT */

    /********************************************************************************************
    *    Purpose:        Generate script to remove mirroring in SQL 2005                            *
    *    Author:            Farhan F. Sabzaali                                                        *
    *    Date Created:    March 17, 2009                                                            *    
    *    Last Updated:                                                                            *    
    *    Updates Made:                                                                            *
    *    Instructions:    1. Run script at server \ instance to identify all mirrored database    *
    *********************************************************************************************/
    
    DECLARE     @DatabaseName            NVARCHAR (250)
            ,@SQL                    NVARCHAR (MAX)
            ,@NewLine                CHAR (1)
            ,@APOS                    CHAR (1)                
            
    SET        @NewLine = CHAR(13)
    SET        @APOS = CHAR (39)            
    SET        @SQL = ''
    
    DECLARE CurDatabase CURSOR FOR
        SELECT    B.Name
        FROM    sys.database_mirroring A 
                INNER JOIN 
                sys.databases B 
                ON 
                A.database_id = B.database_id
                AND
                A.Mirroring_Guid IS NOT NULL

    OPEN CurDatabase

    FETCH NEXT FROM CurDatabase INTO @DatabaseName
    
    WHILE @@FETCH_STATUS = 0
    BEGIN

        SET @SQL = @SQL + '
        ALTER DATABASE ' + @DatabaseName + ' SET PARTNER OFF;
        GO
        RESTORE DATABASE ' + @DatabaseName + ' WITH RECOVERY;
        GO'

        SET @SQL = @SQL + @NewLine

        FETCH NEXT FROM CurDatabase INTO @DatabaseName

    END

    CLOSE CurDatabase

    DEALLOCATE CurDatabase
    
    SET    NOCOUNT ON
    
    SELECT    @SQL
    
    SET NOCOUNT OFF











About the author

Merdad Aboudizadeh is the architect and creator of the Office Exchange software
