Well, thanks to our wonderful DBA, for the last couple of months we have been having weird timeout issues on our primary data warehouse SQL Server. After picking through the server setup I found he had turned on 'Boost SQL Server Priority'. This is a really large box, and it serves only as a DB server, so there is really no need to monkey with that setting at all. I changed it back to the default and restarted SQL Server, and we haven't had an issue since then...
Executed as user: {username}. TCP Provider: The semaphore timeout period has expired. [SQLSTATE 08S01] (Error 121) Communication link failure [SQLSTATE 08S01] (Error 121). The step failed.
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
---
S Douglas wrote: I changed it back to defaults, restarted SQL, we haven't had an issue since then…
It's now been several months since I made this change, and NOT ONCE have I had issues of this nature.
---
Bitten again by the same issue from March last year (see my blog below). This time I spent an entire day trying to figure out how best to deal with the issue. Of course, these days I have backups of the cubes (SQL Server 2005 Analysis Services); however, I have never tried restoring from those backups. I do have a project buried on my plate to try a full restore of the cubes, but so far I have not had the time. With the development requests that end up on my lap and the various firefighting I end up doing, important things always seem to slip through the cracks.
Anyway, enough complaining. This time I tried something new: I restarted the DW SQL Server, and lo and behold, the product dimension started processing. I swear I looked at every process running on that box and couldn't find anything that looked amiss. I am so glad I tried it, because this is way easier than trying to restore and reprocess things for the last few days... WOW!! I am so relieved; if it wasn't so late I'd party like a rockstar...
<return xmlns="urn:schemas-microsoft-com:xml-analysis">
  <results xmlns="http://schemas.microsoft.com/analysisservices/2003/xmla-multipleresults">
    <root xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
      <Exception xmlns="urn:schemas-microsoft-com:xml-analysis:exception" />
      <Messages xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
        <Error ErrorCode="3238002695" Description="Internal error: The operation terminated unsuccessfully." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3238002695" Description="Internal error: The operation terminated unsuccessfully." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3238395904" Description="OLE DB error: OLE DB or ODBC error: Transaction (Process ID 64) was deadlocked on thread | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.; 40001." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3240034316" Description="Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Vw Dim Product CUBE', Name of 'Product' was being processed." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3240034317" Description="Errors in the OLAP storage engine: An error occurred while the 'Subdepartment Number' attribute of the 'Product' dimension from the 'Cubes' database was being processed." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3239837698" Description="Server: The operation has been cancelled." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
      </Messages>
    </root>
  </results>
</return>
---
Here is another script for getting the run time of a SQL Agent job:
USE msdb;

SELECT
    j.name,
    h.step_id,
    h.step_name,
    -- run_duration is an integer encoded as HHMMSS; convert it to
    -- whole minutes (rounding seconds up, with a minimum of 1 minute)
    ISNULL(NULLIF(CONVERT(int,
          (h.run_duration / 10000 * 60)        -- hours -> minutes
        + (h.run_duration / 100 % 100)         -- minutes
        + CEILING(h.run_duration % 100 / 60.0)), 0), 1) AS DurationMinutes,
    h.run_date
FROM sysjobs j
    JOIN sysjobhistory h ON j.job_id = h.job_id
WHERE j.name = '{job name}'
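The odd-looking arithmetic works because SQL Agent stores run_duration as a plain integer in HHMMSS form (and run_date as YYYYMMDD). Here's a small Python sketch of the same minutes conversion, just to make the encoding concrete (the sample values are made up):

```python
import math

def duration_minutes(run_duration: int) -> int:
    """Convert SQL Agent's HHMMSS-encoded run_duration to whole minutes.

    Mirrors the T-SQL expression: hours*60 + minutes + ceil(seconds/60),
    floored to a minimum of 1, matching the ISNULL/NULLIF guard.
    """
    hours = run_duration // 10000
    minutes = (run_duration // 100) % 100
    seconds = run_duration % 100
    total = hours * 60 + minutes + math.ceil(seconds / 60)
    return total if total > 0 else 1

# 1 hour, 2 minutes, 30 seconds -> 63 minutes
print(duration_minutes(10230))
# 45 seconds rounds up to the 1-minute floor
print(duration_minutes(45))
```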
---
Here is a script to send out a scheduled report from SSRS. SSRS creates a SQL Agent job for each subscription schedule, named after the ScheduleID, so starting that job fires the subscription:
USE ReportServer;

DECLARE @EXEC_ID varchar(255);
DECLARE @RPT_NAME varchar(255);

SET @RPT_NAME = '{Report Name}';

-- Find the schedule behind the report's subscription; the SQL Agent
-- job that delivers the subscription is named after this ScheduleID.
SET @EXEC_ID = (
    SELECT TOP 1 CONVERT(varchar(255), a.ScheduleID)
    FROM ReportServer.dbo.ReportSchedule a
        INNER JOIN ReportServer.dbo.Subscriptions d ON a.SubscriptionID = d.SubscriptionID
        INNER JOIN ReportServer.dbo.Catalog e ON d.Report_OID = e.ItemID
    WHERE e.Name = @RPT_NAME
);

EXEC msdb.dbo.sp_start_job @job_name = @EXEC_ID;
---
This script loops through all of the tables in a database whose names match a pattern and drops them, logging what it dropped:
-----------------------------------------------------------
CREATE TABLE #results
(
    tbl_Name varchar(255),
    dropped_at datetime
)
-----------------------------------------------------------
DECLARE @tbl AS sysname
DECLARE @sql AS varchar(4000)
DECLARE @fnd AS varchar(25)
SET @fnd = 'fact'

DECLARE TblCursor CURSOR FOR
    SELECT [name] AS tbl_name
    FROM sys.tables
    WHERE [name] LIKE '%' + @fnd + '%'

OPEN TblCursor
FETCH NEXT FROM TblCursor INTO @tbl
WHILE @@FETCH_STATUS = 0
BEGIN
    -- QUOTENAME guards against odd characters in table names
    SET @sql = 'DROP TABLE ' + QUOTENAME(@tbl)
    PRINT @sql
    EXECUTE (@sql)
    IF @@ERROR = 0
        INSERT INTO #results (tbl_Name, dropped_at) VALUES (@tbl, GETDATE())
    FETCH NEXT FROM TblCursor INTO @tbl
END
CLOSE TblCursor
DEALLOCATE TblCursor
-----------------------------------------------------------
SELECT * FROM #results ORDER BY dropped_at
-----------------------------------------------------------
DROP TABLE #results
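The same filter-and-generate pattern can be sketched outside T-SQL. Here's a minimal Python illustration (the table names are made up for the example) of building the DROP statements from a name filter before executing anything:

```python
def drop_statements(tables, pattern):
    """Return a DROP TABLE statement for every table name containing pattern."""
    return [f"DROP TABLE [{t}]" for t in tables if pattern in t]

# Hypothetical table list; in practice this would come from sys.tables
tables = ["fact_sales", "dim_product", "fact_inventory", "audit_log"]
print(drop_statements(tables, "fact"))
```

Generating the statements first (as the PRINT in the T-SQL does) lets you eyeball exactly what will be dropped before you let the loop execute.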
modified on Friday, September 24, 2010 9:51 AM
---
A friend of mine just asked how to use a CASE statement in SQL. Figured I would share it here as well. This is just some simple stuff for beginners, nothing fancy or complicated.
CREATE TABLE #tmp(i int, s varchar(255))
DECLARE @iRowCnt INT, @i INT, @sUID VARCHAR(255)
SET @iRowCnt = 1000
SET @i = 0
WHILE @i <= @iRowCnt
BEGIN
SET @sUID = CAST(NewID() as Varchar(255))
INSERT INTO #tmp(i, s) values(@i, @sUID)
SET @i = @i + 1
END
SELECT
    i,
    CASE WHEN i % 2 = 1
        THEN 'Got Nothing'
        ELSE s
    END AS Case_Test
FROM #tmp

SELECT
    i,
    s
FROM #tmp
WHERE
    (CASE WHEN i % 2 = 1
        THEN 'Got Nothing'
        ELSE s
    END) = 'Got Nothing'
DROP TABLE #tmp
---
Having a hard time sleeping tonight. So I logged into the computer to putz around and wound up here. I just noticed I've been trolling CP for over 6 years off and on... wow.
Need to find more time to come back and participate.
---
It's now been a few weeks since I put the backup Integration Services package in place. All has been working just fine; as a matter of fact, our 360 gig of cubes compress down to 90 gig. Not too shabby...
I still need to test a restore, which has me worried. Our test system doesn't have the SAN space available to do a restore to it. I have a plan in place to get that rectified, but won't be able to do it until later down the road...
---
A lot has happened since I last posted. I now have all of our multidimensional databases, plus an Oracle system. WOW, feels like I could spend the rest of my life learning these apps. All of the new apps are on our AIX platform, which means a whole new learning curve for me.
So instead of talking about that technology, for the time being I'm going to take a second here and brag for a moment. I bought a Canon 50D camera about two months back, and then a few weeks ago I picked up an L-series 100-400mm lens. I am by no stretch of the imagination a photographer, but I do enjoy getting outside to take pictures. It just so happens I can afford whatever toys I want.
I post my stuff at http://sentinalsoftware.com/images/[^] for anyone that's interested. There are some cool pictures, but most will be pretty lame to anyone else (like I said, I just like getting outside to take the pictures; it doesn't always have to be a killer shot for me to shoot it).
That's all for now, gotta get back to work.
modified on Thursday, July 2, 2009 12:15 AM
---
Okay, so this weekend I spent some time working with the cubes in an attempt to set the production environment up on another server, with some sort of automation and without reprocessing.
1:
Goal: Use the cube synchronization functionality of SSAS.
Result: Failed; the roughly 200 gig our cubes total requires too much time to sync before they start changing.
2:
Goal: Backup & restore.
Result: This has worked in test, but again I've had timing issues. The backup can start just after the cubes finish (this is a candidate for Saturday after processing).
3:
Goal: Script out creating a copy of the cube structure (the method Ralph prefers), but that requires the cubes to be fully processed afterwards.
Result: This is how I created a copy on test; it works but is way too time consuming.
Right now the backup seems to be the best method, because then I can have the restore happen on as many servers as I want (I have two servers that I can use for this). Alas, nothing has quite worked out yet.
---
I just got home from hanging out with friends, listening to an old mp3 player I found in my mess of things, sitting here getting ready for bed. I'm really surprised at what's on the mp3 player: some really good music, and some, well, not so much. I haven't used this mp3 player in probably 4 years (it's an old iRiver 256 meg bought quite some time ago). I'm really surprised it still works after the way I found it stored (sitting under a sledgehammer). I had thought about buying a new one, but knew I wouldn't use it while biking, as I just enjoy forgetting everything around me while on the bike; same with hiking. Though it might be a good idea at work. It's not a loud environment; actually, to the contrary, it's really quiet. There are some little noises that just drive me nuts. I used to sit and code while listening to classical music. I honestly don't remember why I stopped coding with music; I like tuning out the world while I focus on the task at hand. Think I'm going to use this mp3 player for a while and see how it goes; if I end up really using it, then I will start looking for a new one with a smidgen more capacity.
Okay, off to the real reason I wanted to blog today. I created a job yesterday to sync my cubes to my test server; I thought for sure they would sync up given enough time (my cubes in total are only about 200 gig). The job was set up to start at 06.30, which was plenty of time after nightly processing and well before the 13.00 processing that fires off. I randomly checked on the jobs throughout the day; surely that was plenty of time for that little bit of data to sync? As it turns out, nope, it's not... I had already built the cubes on the test server; sure, they were out of date, but not that far. Don't get it...
On Friday, I did a backup of the test system (which has an about week-old copy of the cubes that I manually built). That backup only took about 4 hours to make. I'm now thinking the way to keep test current with prod is to do a daily backup; I'm not sure how that will affect our users (if it does at all, then I won't be able to do it daily, which isn't a deal breaker).
I am really disappointed in Microsoft about the sync issue; why shouldn't it have finished in 6.5 hours? Or at least been done enough that it didn't need the production server any more??? The whole point of cubes is to allow you to present huge amounts of data to users. The Microsoft engineers dropped the ball on this one. That, or I completely missed the boat here. Hmmmmmm.
---
Okay, so I have a new challenge on my plate. First off, I know it sounds like I'm constantly bitching about things, but a blog really is just a good place to complain. No one really reads my blog anyways.
So this weekend our SQL Server Reporting Services server decided to up and roll over. The web portal was up to serve reports, but every one of the reports would error with some type of permissions issue. Nothing had changed on the server, and I mean nothing that I can tell has changed; it's the weirdest thing. Now mind you, I'm not the one who handles reports or deals much with Reporting Services; my boss just said okay everyone, time to get up at 05.20 on Sunday morning and troubleshoot this issue (which is when it was first noticed). Turns out the report server started having health issues Saturday morning. The web front end can't connect to its internal DB; the database for Reporting Services is up and running just fine, it's the internal engine that appears to be off in lala land.
So much so, in fact, that we can't connect to it to do any report admin or push new reports. It's really got me perplexed. I've not yet found a resolution for the issue and will post it as soon as I do, but for the time being we got past the outage by opening permissions up on the DW DB side.
For what it's worth, here is the error message I'm seeing in the event log.
Event Type: Error
Event Source: Report Server (MSSQLSERVER)
Event Category: Management
Event ID: 107
Date: 4/20/2009
Time: 9:30:02 AM
User: N/A
Computer: xxxx
Description:
Report Server (MSSQLSERVER) cannot connect to the report server database.

For more information, see Help and Support Center at http:
There is also this weird error message, but it's been occurring for quite some time now:
Event Type: Error
Event Source: DCOM
Event Category: None
Event ID: 10016
Date: 4/20/2009
Time: 11:40:44 AM
User: NT AUTHORITY\NETWORK SERVICE
Computer: xxxx
Description:
The application-specific permission settings do not grant Local Activation permission for the COM Server application with CLSID
{BA126AD1-2166-11D1-B1D0-00805FC1270E}
to the user NT AUTHORITY\NETWORK SERVICE SID (S-1-5-20). This security permission can be modified using the Component Services administrative tool.

For more information, see Help and Support Center at http:
I'm supposed to be working on a new cube, and modifications to a couple of existing cubes, but that's been put on hold while I try to come up with a solution to this issue. Gotta love working on a small team, eh?
---
For the last two days I've been battling with a measure group in one of my cubes and an external report not matching up. Redundant reporting, you bet, but different users use different reporting interfaces. Either way, numbers were not matching up. Actually, from outward appearances the totals for the affected measure group never changed over the last 5 days, which is just flat wrong; not only do they get processed at night, they get processed by a job that I set up just for this issue.
Now, when I tear into the numbers by SKU they all match up perfectly; it was the rollup that was causing the issue. For the life of me, I was going nuts. It just didn't make any sense. To make matters worse, my co-worker who knows all the data structures is no longer with the company (a contractor whose contract was up). I do have his email address and asked him for help, but based on what I was seeing and the numbers, it just didn't make any sense to either one of us.
So tonight I set up downtime for the cubes so I could play with them at will. First thing I did was exclude SKUs that didn't have a certain value, process the cube, and go number crunching. Suddenly everything was in alignment!!! Excluding zeros caused the rollups to work correctly. WTF? I don't get it; perhaps I'm not supposed to, but it now works.
Oh, and in case you're wondering, no, the SSRS report doesn't exclude the zero values. It uses about the exact same logic (so much so that I pulled it out to do number matching the other day).
On to the next challenge...
---
Another long story; I will try to keep it brief, because I see my last batch job finished and I would really like to get some sleep.
Pesky users, they actually want to view the data...
Last week, we prepped for a calculated member change. The change came down from high above, signed off on, and was deemed important enough that Change Management would have to wait. So with that in mind we checked, and double checked, the code. Made sure everything looked good, then I closed my laptop for the day to drive home (I have a laptop that I carry around and connect to my workstation at my cube, where I do all of my work). When I got home I dutifully logged in and checked everything again, then had some chow. At the assigned time, I tried pushing my changes, only to find Visual Studio BI had decided it didn't want to push my changes out to the server, but to my local machine. Fine, be like that (more on this later); so I change VS to point back to the server, and try the push again. Nope, nada; this time it's forgotten the impersonation password. Ugg! That's kept in a secure place in the office! Here I am thinking sh*t, I'm going to have to call my co-worker and have him meet me at the office so we can get the password. Thankfully, he's had this particular issue so many times he's flat-out remembered the password. So as I'm wiping the sweat off my brow I make the push, and lo and behold, it takes. No errors. Sweet! I rejoiced, finally. Just to be sure everything is fine, I check ProClarity; hey, look at that, the change is there. Time for more important things (xbox ).
Come next morning, bright and early, after I send the email giving the thumbs up to start hitting the cubes, I start getting reports that no one can view the cubes. Weird; open ProClarity, double check, yup, the cubes are there. Call the user back (one of our super users who doesn't cry foul unless there is indeed something wrong). She reports that indeed no cubes are present. I had her do some basic troubleshooting. Nope, no change. Hmpft. WTF.
Call my coworker; he thinks the password got borked, time to fetch a clean copy of the cubes and re-push them with the password. No change.
Several hours go by, trying everything imaginable....
My coworker looks over at me and says hey, pull open the "roles" object. What permissions are there on the server? WTF, none of the roles have any permissions.... Somewhere along the line VS decided cubes don't need permissions and yanked them out. WTF!!!!! So I updated the permissions, called a user, had her log in, and everything is up.
I've used Visual Studio for years and years; in that time I have learned that there are little quirks here and there that you have to live with. It's just part of life. However, when you add the Business Intelligence add-ons, you're in for a whole new world of hurt. VS is even flakier than ever. Things like the above issues, plus more; honestly, it's enough to make me start hating VS. I will grant you, the two words Business and Intelligence should probably never be used in the same sentence, but this is insane.
If you ever find yourself in a position where the users cannot view the cubes, check the roles object; it's likely the users have lost permissions.
---
While I'm sitting here watching some data process, I figured I would share with the rest of the world some of the, umm, interesting things that have come up while dealing with SQL Server Analysis Services (SSAS). Actually, something I would like to do is blog more often about the topic; it will be interesting to see if anyone else reads these entries. Here we go.
Dimension processing... (Part Two)
In part one of my dimension processing post, I spoke of losing a dimension (a dim that's related to half the schema). Here is an interesting little error I encountered while re-processing that same dimension:
File system error: The background thread running lazy writer encountered an I/O error. Physical file: \\?\C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\CB Cubes.0.db\Vw Dim Product CUBE.0.dim\172.Vw Dim Product CUBE.asstore. Logical file: .
File system error: A FileStore error from WriteFile occurred. Physical file: \\?\C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\CB Cubes.0.db\Vw Dim Product CUBE.0.dim\172.Vw Dim Product CUBE.asstore. Logical file: . : The process cannot access the file because another process has locked a portion of the file. .
When this cropped up, I'm sure you can imagine where my stomach went. I was certain that maybe the RAID array was dying. Who could blame it; we abuse that server to no end. Even with 8 cores and 16 gig of RAM, the poor thing is constantly seeing some type of abuse; whether it's an ETL package or our user base, it just doesn't get much sleep.
Fortunately, or unfortunately as it were, after careful review of the event logs there was NADA about the disks or any I/O issues. Given I didn't want to see us lose any more time, I called my coworker, who started processing the dim again. At the same time, he was able to figure out that a backup job ran at the same time I got my error message. So it seems pretty clear that the backup just happened to be locking the portion of the file I was using.
While he was off reprocessing and figuring out the cause of the error (turns out he cheated and had a schedule of when backup jobs hit this server), I was off on Google looking for answers. It was amazing how many others had the same issue but never figured out what the cause was. There are some people who have dimensions so large they actually exceed the 4 gig limitation on dims. I'm glad to say our dims aren't that large (yet). So if you ever see the above error messages, stop and look at what's happening on the server; you just never know what those pesky sys admins are up to.
In the end, the dim processed just fine and has been healthy since...
---
SSAS from the trenches
While I'm sitting here watching some data process, I figured I would share with the rest of the world some of the, umm, interesting things that have come up while dealing with SQL Server Analysis Services (SSAS). Actually, something I would like to do is blog more often about the topic; it will be interesting to see if anyone else reads these entries. Here we go.
Dimension processing...
A hard lesson I learned over the weekend was that any time you fully process a dimension, every fact table that relies on that dim needs to be reprocessed. When your cubes are over 500 gig, that can take some time (I am still processing the cubes, and probably will be till this weekend).
Now as it turns out, after investigation, we would have had to reprocess everything no matter what. This weekend was DST for us, and someone previous to me screwed up the step order in a job. We have two SQL jobs (running ETL refreshes of dims and cubes) that refresh our Analysis Server. The first one updates the dims; the second updates the cubes. The second one does have a couple of waits set up so it doesn't trample on anything else, but its first step is to restart SSAS. When SSAS got restarted, it must have been at a critical point, because I couldn't get the dim that had been processing to process and come up clean. It was erroring out. See the ever-so-enlightening error message below.
Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. OLE DB error: OLE DB or ODBC error: Transaction (Process ID 73) was deadlocked on thread | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.; 40001. Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Vw Dim Product CUBE', Name of 'Product' was being processed. Errors in the OLAP storage engine: An error occurred while the 'First Sales Actual Date' attribute of the 'Product' dimension from the 'CB Cubes' database was being processed. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. Internal error: The operation terminated unsuccessfully. OLE DB error: OLE DB or ODBC error: Operation canceled; HY008. Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Vw Dim Product CUBE', Name of 'Product' was being processed. [snip four more pages of about the same thing]
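The root cause in that dump is SQL Server choosing the processing query as a deadlock victim ("Rerun the transaction"). When a step can hit deadlocks like this, one common coping pattern is retry with exponential backoff. Here's a minimal, hypothetical Python sketch of that pattern; the flaky_processing callable and the string-based error check are stand-ins for illustration, not the SSAS API:

```python
import time

def run_with_retry(operation, is_deadlock, max_attempts=3, base_delay=1.0):
    """Run operation(), retrying with exponential backoff whenever
    is_deadlock(exc) says the failure was a deadlock victim."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as exc:
            # Give up on the last attempt, or on non-deadlock failures
            if attempt == max_attempts or not is_deadlock(exc):
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Toy demonstration: fail twice with a "deadlock", then succeed.
attempts = {"n": 0}
def flaky_processing():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("deadlock victim; rerun the transaction (40001)")
    return "processed"

result = run_with_retry(flaky_processing,
                        lambda e: "deadlock" in str(e),
                        base_delay=0.01)
print(result)  # processed
```

In a real job, the equivalent would be a retry step in the SQL Agent job or SSIS package around the processing command.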
This has been one really hard week; my coworker and I have been manually babysitting the jobs all week. The first thing on the order was to get the cubes back up and at least populated with a month's worth of data. Now I'm following up and processing everything (the users are clamoring for the data). Actually, funny story: on Tuesday, the CEO caught me in a meeting and called me out, asking how the cube processing was coming along. Now understand, our CEO is a really nice person; I just don't care how nice she is, it's never a good thing, in my opinion, for someone of her nature to know your name and what you're working on. Especially when it's something that's broken. Oh well, at least she took it in stride and was more worried about the hours I'm putting in getting things back (yeah, she's that nice).
If you liked this post, then there is more to come. I would like to keep things kind of on topic, so one issue per post.
---
You know, I have a ton of scripts that I've written. I'm sure someone else could benefit from one of them; sure would be nice if there was an easy way to format them with color for posting on the internet.
I tried <pre lang="vb.net"> when posting here on my blog, but alas it didn't work. If anyone reads this, please reply and let me know if there is a handy tool out there to format code.
Thanks
---
Interesting; I came here wanting to blog about my new job, but I don't see a "create new post" option. Wonder if it has anything to do with FF3 or IE8 (checked with both browsers). Then again, maybe it's just that I'm blind (which is likely the case).
---
As it turns out, I'm not blind; there is a bug. Interesting.
I can't add a new message to my blog[^]. Guess I will just have to wait; meh.
---
Okay, so I can't wait until the bug is fixed, so I'm just going to reuse another post.
A lot has happened since my last posting here. For the last year and a half, I've been insanely busy with work. Actually, I've been so busy it doesn't even feel like that long. As I previously posted, I got hired on as a Data Analyst, working with SQL Server Analysis Services (SSAS). As it turns out, of course, there is more to the job than just my cubes; the guy who is responsible for the Integration Services portion of the data warehouse seems to know less than I do, and he's been involved in data warehousing far longer than I've even known of the concept. We have a part-timer who handles report creation and maintenance who is a good egg.
If you don't know what a data warehouse is, simple: it's nothing more than a relational database that has been denormalized, with the data in it pre-aggregated. I won't carry on and on about what a data warehouse is, much less SSAS (at least not right now), as there is a ton of information about the concepts involved out on the internet. Suffice it to say, there is some really cool sh*t involved here that should keep me entertained for a long time.
Couple of primer websites if you’re interested.
http://www.ssas-info.com/[^]
http://channel9.msdn.com/tags/Business+Intelligence/[^]
I didn't just come here to brag about the new job, but to comment about a few other things. For one, I just built a new computer.
Specs
Intel quad core, 12 MB L2 cache
8 gig RAM (1066)
600 gig drive
Dual-DVI 512 MB video card
(2) 20" Acer LCDs
There's a lot of horsepower packed into this little machine. I plan on using it as a test bed for working with SSAS once I finish installing all of the stuff I will need on it (thank goodness for my MSDN subscription, a freebie from work; woot). I was sitting here today trying to figure out a nice setup for my work laptop plus my home machine and the two monitors, when it dawned on me I really need one of the LCDs attached to the work laptop, because its 15" screen sucks (actually it's nice, but it's tiny compared to the 20s). Now, having two keyboards and mice set up for the two machines would be unbearable.
The solution?
I experimented with Synergy several years ago when it was in its infancy. Didn't really need it back then, and a KVM was just fine for what I was doing anyways. Plus, it was rather unstable. A lot of time has passed and my needs have changed, so I went back to it. Wow, install and config were a breeze, and it just works the way I want it to. So here I sit, typing on my computer, while my laptop sits connected to my workstation at work with Outlook open. Just by moving the mouse I can flip between the two machines. This is really the cat's ass.
Link to Synergy
http://synergy2.sourceforge.net/[^]
On a personal note, an ex-girlfriend of mine texted me today requesting assistance moving out of her boyfriend’s house. Now I don’t know what the deal is with those two people, I do know though for sure, that woman has a certain control over me that no one else could possible imagine. No matter how much time passes or who I’m with, I still get all gaga over her every time we talk. She wants to get together sometime this week and chat. Don’t know for sure what that means, but I’m all out of my usual sorts because of it. Women sigh
Well, if you read this entire post, then I feel sorry for you. It’s just a lot of stuff I wanted to dump out; heck, there is even more that I haven’t dumped yet but want to in some form or another. I’m working with some really cool technology at work, and home life is pretty decent. Now if I could just find the energy to clean my apartment.
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
I don't know why, but I love it when the CP site yells at me
---------------------------
Windows Internet Explorer
---------------------------
This message is very long. Long messages increase download times for those with slow connections. Are you sure you want to post a message this long?
---------------------------
OK Cancel
---------------------------
It's almost as if Chris and team are trying to say I'm long winded.
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
Here is a script for mapping a printer with VBScript:
Option Explicit
' 1: Maps Printer
Main

Sub Main()
    Call MapPrinter("\\UNC\Path", True)
End Sub

Sub MapPrinter(sPrinter, bDefault)
    Dim objNet
    ' *** Map the printer for the user *** '
    Set objNet = CreateObject("WScript.Network")
    objNet.AddWindowsPrinterConnection sPrinter
    If bDefault Then
        objNet.SetDefaultPrinter sPrinter
    End If
    Set objNet = Nothing
End Sub
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
Here's another script that's no longer used. Again, posting it for someone else to use: it deletes files in a configured path older than a configured number of days, both read from an .ini file named after the script.
Option Explicit

Call Main()

Sub Main()
    Dim sPath, sSelect, dDate
    Dim objWMIService, objChild, objFile
    Dim objFSO

    sPath = GetIniValue("Path")
    If Len(sPath) < 1 Then
        Exit Sub
    End If

    Set objWMIService = GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\cimv2")

    ' Build a WMI CIM_DATETIME cutoff string (yyyymmddHHMMSS.mmmmmm+UTC),
    ' zero-padding the month and day parts
    dDate = DatePart("yyyy", DateAdd("d", GetIniValue("NumDays"), Date))
    If DatePart("m", DateAdd("d", GetIniValue("NumDays"), Date)) < 10 Then
        dDate = dDate & "0" & DatePart("m", DateAdd("d", GetIniValue("NumDays"), Date))
    Else
        dDate = dDate & DatePart("m", DateAdd("d", GetIniValue("NumDays"), Date))
    End If
    If DatePart("d", DateAdd("d", GetIniValue("NumDays"), Date)) < 10 Then
        dDate = dDate & "0" & DatePart("d", DateAdd("d", GetIniValue("NumDays"), Date))
    Else
        dDate = dDate & DatePart("d", DateAdd("d", GetIniValue("NumDays"), Date))
    End If
    dDate = dDate & "000000.000000+000"

    sSelect = "Select * from CIM_DataFile where Path = '" & sPath & "' and CreationDate <= '" & dDate & "'"

    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objChild = objWMIService.ExecQuery(sSelect)

    ' Delete every file created on or before the cutoff date
    ' (FileSystemObject has no Delete method; DeleteFile is the correct call)
    For Each objFile In objChild
        objFSO.DeleteFile objFile.Name, True
    Next

    Set objFSO = Nothing
    Set objChild = Nothing
    Set objWMIService = Nothing
End Sub

Function GetIniValue(sID)
    On Error Resume Next
    Dim objFSO, objTextFile
    Dim sValue, sKey, sFileName

    Set objFSO = CreateObject("Scripting.FileSystemObject")
    sFileName = GetScriptName(".ini")
    If objFSO.FileExists(sFileName) = False Then
        MsgBox "No configuration file defined!"
        GetIniValue = vbNullString
        Set objFSO = Nothing
        Exit Function
    End If

    ' Scan the ini file for a "key=value" line whose key matches sID
    Set objTextFile = objFSO.OpenTextFile(sFileName)
    Do While Not objTextFile.AtEndOfStream
        sValue = objTextFile.ReadLine()
        If InStr(sValue, "=") Then
            sKey = Split(sValue, "=")
            If sKey(0) = sID Then
                GetIniValue = sKey(1)
            End If
        End If
    Loop
    objTextFile.Close

    Set objTextFile = Nothing
    Set objFSO = Nothing
End Function

Function GetScriptName(sExt)
    ' Derive the companion file name from the script's own name,
    ' e.g. cleanup.vbs -> cleanup.ini
    GetScriptName = Replace(WScript.ScriptName, ".vbs", sExt)
End Function
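For reference, the script looks for a companion .ini file named after itself (via GetScriptName) holding the two keys it reads. A minimal sketch with placeholder values might look like this; note that WQL string literals want backslashes doubled in the CIM_DataFile Path, and NumDays should be negative to reach back in time, since it goes straight into DateAdd:

```ini
; hypothetical cleanup.ini, sitting next to cleanup.vbs
Path=C:\\Temp\\Logs\\
NumDays=-30
```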
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.