|
Yes, very true.
Use the best guess
|
|
|
|
|
I'm studying SSRS and, due to circumstances, have limited time to study. Please suggest which topics I should focus on so as to crack an interview and also be able to use them in a new job.
Thanks.
PS: If my question seems too generic, please still answer as per your understanding.
|
|
|
|
|
See here[^]; although you will be lucky to pass an interview without some practical experience.
Use the best guess
|
|
|
|
|
How is two-phase commit handled in SQL Server?
And what is the difference between two-phase commit in SQL Server and Oracle?
Please help me.
|
|
|
|
|
You need some google foo[^]. I will be interested to see whether anyone experienced with this responds!
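For what it's worth, here is a minimal T-SQL sketch of what starting a two-phase commit looks like from the SQL Server side. It assumes MSDTC (the Distributed Transaction Coordinator) is running and that a linked server is already configured; the linked server name (RemoteSrv), database (BankDb), and table (Accounts) are made up for illustration:

```sql
-- SQL Server delegates the two-phase commit protocol to MSDTC.
-- RemoteSrv, BankDb and Accounts are hypothetical names.
SET XACT_ABORT ON;  -- roll the whole transaction back if any statement fails

BEGIN DISTRIBUTED TRANSACTION;

UPDATE dbo.Accounts
SET Balance = Balance - 100
WHERE AccountID = 1;

UPDATE RemoteSrv.BankDb.dbo.Accounts
SET Balance = Balance + 100
WHERE AccountID = 2;

-- COMMIT drives phase 1 (prepare on every participant) and, if all
-- participants vote yes, phase 2 (commit everywhere) through MSDTC.
COMMIT TRANSACTION;
```

As for the Oracle side of the question: Oracle coordinates distributed transactions itself over database links, designating one participant as the commit point site, rather than using an external coordinator service like MSDTC.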
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Hello, two newbie questions on OLAP. I'm looking at my cube from Visual Studio ...
(a) Edit Dimension
I want to basically add a new dimension level - under the Design view of "Dimension Structure" ...
I can see the Data Source View tab containing the two existing tables which make up my original dimension. Trouble is, how can I add the third? Can I add a table to this dimension?
(b) Delete Partition
Under the cube's Design view there's a "Partitions" tab, and I can delete partitions manually one by one: right-click and select "Delete" from the context menu. Is there a quicker way of doing this?
Thanks
dev
|
|
|
|
|
Hi guys,
I have 2 tables, Table1 and Table2.
Table1 holds product and serial number information, and Table2 holds a set of 32 check points that say whether a product has a defect or not. I want to get only the total number of non-defective products for each month.
For example, say I have 3 products in March and only 2 of them satisfy the set of 32 check points and are non-defective; for this case the result should be that the total number of non-defective products is 2. I want this in a query.
I tried, but my query gives the defect count for each product; I want only the total number of non-defective products.
here is my query
WITH months (num) AS (
    SELECT 1
    UNION ALL
    SELECT num + 1
    FROM months
    WHERE num < 12
)
SELECT m.num AS [Month],
       ISNULL(t.resolved, 0) AS Totalresolved,
       t.YearValue AS YearData
FROM months m
LEFT JOIN (SELECT MONTH(ProgramStartedDate) AS [Month],
                  YEAR(ProgramStartedDate) AS YearValue,
                  SUM(CASE WHEN (ProgrammerStatus = 1 OR ReviewerStatus = 1) THEN 1 ELSE 0 END) AS resolved
           FROM tb_ProgramCheckList
           LEFT JOIN dbo.tb_ProgramDetails
                  ON tb_ProgramDetails.ProgramID = tb_ProgramCheckList.ProgramID
                 AND tb_ProgramDetails.Version = tb_ProgramCheckList.VersionID
           WHERE ProgramStartedDate >= '10/1/2012'
             AND ProgramStartedDate < '10/01/2013'
           GROUP BY MONTH(ProgramStartedDate), YEAR(ProgramStartedDate)
          ) AS t ON t.[Month] = m.num
Any help solving this issue would be appreciated.
Thanks and regards
VISHWA
|
|
|
|
|
Does table 2 have 32 bit columns or two columns with one record per check?
|
|
|
|
|
Each program has 32 bit checks to show whether the item is defective or not.
|
|
|
|
|
Great design - what happens when you want to add a new check!
You need to add up the values of the 32 checks (presumably booleans where 1 = positive): if the sum equals 32 the program passes, anything else and the program fails.
SELECT *
FROM (SELECT C1 + C2 + C3 + ... AS checktotal, Program FROM table) AS SubQuery
WHERE checktotal = 32
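To get the monthly total the original question asked for, the sum-of-checks idea above can be wrapped in a GROUP BY. A rough sketch, assuming the 32 checks are bit columns C1..C32 on Table2 and that Table1 carries a date column and a shared key (SerialNumber and ProductionDate are made-up names here):

```sql
-- Count products per month where all 32 checks passed.
-- SerialNumber and ProductionDate are assumed column names.
SELECT MONTH(t1.ProductionDate) AS [Month],
       COUNT(*) AS NonDefectiveCount
FROM Table1 t1
JOIN Table2 t2
  ON t2.SerialNumber = t1.SerialNumber          -- assumed join key
WHERE t2.C1 + t2.C2 + t2.C3 + ... + t2.C32 = 32 -- all 32 checks passed
GROUP BY MONTH(t1.ProductionDate);
```

Spell out the remaining check columns in place of the ellipsis; if the data spans more than one year, also group by YEAR(t1.ProductionDate).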
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
We have a production table with a column called sheets.
When we are making the final product in our production, sheets will be stored in another table called control_measures.
This means we will only store sheets in production when we are making the final product.
I want to know what you guys think.
Should I normalize and move sheets to a new table related to production, or should I make it nullable as it is?
I know the recommended way of doing it in relational databases is normalization.
But still, I'm curious to know what you guys have to say about this.
|
|
|
|
|
If you were creating the tables now I would certainly normalize; that makes it easier to make changes in the future.
If this is the current design and it works fine, I wouldn't touch it.
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
So you prefer to normalize even if it is just one column?
What's wrong with a nullable column?
And yes, this is the current design; I'm expanding it.
|
|
|
|
|
Söderlund wrote: So you prefer to normalize even if it is just one column?
Whats wrong with a nullable column?
A nullable column is not dependent on the key it's linked to, while every atomic fact in the record should depend on the key. One can split the field off to its own table with its own identifying key. That's theoretically beautiful.
If you were to implement the "beautiful" method, you'd end up with an extra table and an extra join.
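For what it's worth, here is a rough sketch of what that split looks like in SQL. Only the sheets column comes from the question; the table production_sheets and the key production_id are illustrative names:

```sql
CREATE TABLE production (
    production_id INT PRIMARY KEY
    -- other production columns; no nullable sheets column here
);

CREATE TABLE production_sheets (
    production_id INT PRIMARY KEY
        REFERENCES production (production_id),
    sheets INT NOT NULL  -- a row exists only when sheets is actually recorded
);

-- Reading the data back now costs the extra join:
SELECT p.production_id, s.sheets
FROM production p
LEFT JOIN production_sheets s
  ON s.production_id = p.production_id;
```

The one-to-zero-or-one relationship replaces the NULL: a missing row in production_sheets means what the NULL used to mean.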
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
The proper answer you got from Eddy.
Mine is that it depends; I'd rather normalize once too often than once too seldom.
As I don't know anything about your domain, I also don't know whether to expect changes to your database.
But it's also about performance: most of the time (not always) normalization boosts performance, contrary to popular belief. The most obvious exception is OLAP.
Here's an excellent article[^] on that subject.
Whether or not nulls are a performance hit also depends on what database you're using. Oracle, for example, isn't ISO compliant in this matter and doesn't store NULL values at all; the lack of a value is the NULL value.
SQL Server, on the other hand, stores a NULL token that takes two bytes for variable-length data and the full space for fixed-length data. So if you have a column with a high percentage of nulls you take a hit on storage compared to a separate table. If there's a low percentage of nulls you can keep it in the original table.
And then again, SQL Server nowadays has SPARSE columns. I can't say for sure how well they work as I have never used them. But at least in theory they should have fixed the problem, but give you an extra join for the null bitmap.
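For reference, declaring a sparse column is just a modifier on the column definition (SQL Server 2008 or later); the table and column names here are illustrative:

```sql
-- A SPARSE column stores nothing for NULL values, at the cost of extra
-- overhead for each non-NULL value, so it only pays off when a large
-- share of the rows are NULL. The column must be nullable.
CREATE TABLE production (
    production_id INT PRIMARY KEY,
    sheets INT SPARSE NULL
);
```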
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
I'm kind of doing science here.
No one has done this kind of system before, and the company I'm working for has been turned down by programming firms that do production systems.
To prepare the database for future changes I would have to normalize every column, since no one knows what or how things will work until we have tested it.
Neither memory nor performance is an issue, since we are talking about 6-10k records a year.
|
|
|
|
|
Söderlund wrote: To prepare database against future changes i would have to normalize every
column
Within reason. If you know your domain you can make a qualified guess as to where there will be changes.
Remember that there is a disadvantage to normalizing all the way: your CRUD operations will become complicated.
There's a saying: Normalize 'til it hurts, denormalize 'til it works.
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
Indeed, I can guess, and I have.
However, I could not foresee this change.
I'm torn, because on one hand I have "normalization is the way to go"
and on the other hand I have "code that works".
For the moment I will keep the nullable column, and once it works I will look at normalization and how it would affect the current code.
I also believe it will be easier for future coders if I follow the standard guidelines.
|
|
|
|
|
"Code that works" always trumps change "because it's the correct way of doing it".
Söderlund wrote: if i follow the standard guidelines
Whose guidelines are those?
Make your own guidelines instead; they're easier to follow.
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
Jörgen Andersson wrote: Whose guidelines are those?
That's what a friend was fed at school (I'm not schooled).
So I assumed it was standard, mostly because it makes sense.
Not that I trust the school, since they had a web developer programme with a C# WinForms ball game as the exam and didn't touch PHP at all.
|
|
|
|
|
I would trust the school a lot less if they taught PHP.
Schools shouldn't teach languages, they should teach programming.
C#, in contrast to PHP, enforces a lot of good habits.
Not that you can't program properly in PHP, you certainly can.
But this is a subject that others are much better at answering than I am.
The best place to ask about this is probably the Lounge, but make damn sure it's not phrased as a programming question but rather as a discussion subject, or you might well get fried.
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
I'm not gonna start a programming language war.
My point was that they shouldn't call it a web development course if they spend 80% of the time making offline C# and Java applications.
It should be called a "dip your toes into the programming water" course.
|
|
|
|
|
Jörgen Andersson wrote: Normalize 'til it hurts, denormalize 'til it works
It already hurts to read that. Normalize to 3NF, or better yet, BCNF. Denormalization should only be done when one can explain the trade-offs made and the advantage gained.
Jörgen Andersson wrote: Your CRUD operations will become complicated.
Only if you take a religious stance on optional fields. The other "recommendations" wouldn't impact the typical data operations, nor complicate your queries.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
It's not something I'm following; I prefer to get it as right as possible on the first go. I completely agree with you.
It's just something I added to tease someone.
By "complicated" I mean that the more tables you have, the more there is to do, and the more related tables you have, the more you have to do things in the right order.
Each little operation is simple.
"The ones who care enough to do it right care too much to compromise."
Matthew Faithfull
|
|
|
|
|
Jörgen Andersson wrote: It's just something I added to tease someone.
Jörgen Andersson wrote: Each little operation is simple.
Can't argue with that.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|