
SQL Server Reporting Services SP2 Released

Posted in Reporting Services | SharePoint at Monday, 25 April 2005 05:57 Pacific Daylight Time

Via Andrew Watt:

Service Pack 2 for Reporting Services 2000 is available for download.

Visit SQL Server 2000 Reporting Services Service Pack 2.

Bug fixes in SP2 are listed at Microsoft Knowledge Base article 889640.

I'll be interested to hear what people think of the new web parts versus the RsWebParts. The fact that they are included out-of-the-box makes a big difference in what people use.

Things to Read and Watch

Posted in Sql and Xml | SharePoint | General | Avanade at Tuesday, 15 March 2005 06:33 Pacific Standard Time

Since I'm starting at Avanade on Monday, I've been doing some reading and watching to prepare myself a little for the job. Namely, I've been watching all the Enterprise Library webcasts.

I also wanted to make sure I find time in the near future to read and watch:

So this is a reminder for myself so that I can find all these later on...

Update: There are also hands-on labs that go with the Enterprise Library webcasts here [via Scott Densmore]. After watching the Logging Application Block webcast, I decided to add it into an existing application so that some of the other developers could get emails when the process failed. It took me about five minutes to add it. Watching the webcast made the job easy for me.
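For a sense of how little code that took, here is a minimal sketch of the kind of call the Logging Application Block involves (the class name, category name, and surrounding exception handling are my own placeholders; the actual email delivery is wired up in the logging configuration, not in code):

using System;
using Microsoft.Practices.EnterpriseLibrary.Logging;

// Hypothetical example: log a failure so a configured email sink can notify developers.
public class OrderProcessor
{
    public void Process()
    {
        try
        {
            // ... existing processing logic ...
        }
        catch (Exception ex)
        {
            // "Email Errors" is a placeholder category that would map to an
            // email trace listener/sink in the logging configuration.
            Logger.Write("Processing failed: " + ex.Message, "Email Errors");
            throw;
        }
    }
}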

What do you know

Posted in SharePoint at Thursday, 24 February 2005 05:36 Pacific Standard Time

From Tim Heuer's blog (via Tony Dowler):

inspired by scott hanselmen's post about What Great .NET Developers Ought to Know, as well as the rush of interviews i've been conducting lately to fill sharepoint positions, i started compiling a list of questions for what a sharepoint consultant ought to know...here it is...

Go read it to see how you measure up. Some good questions in there that will help you see what areas of SharePoint you're weak in (if any). It seems like a pretty well-rounded list of questions.

If you do really well on the SharePoint questions and feel like you know everything, go read Scott's What Great .NET Developers Ought to Know and it will probably bring you back down to earth :)

Single Sign-On, Impersonation, and SqlConnections

Posted in SharePoint | ASP.Net/Web Services at Wednesday, 23 February 2005 10:14 Pacific Standard Time

This morning I came across this article: Impersonation, Single Sign-on, and SPS. It is a very interesting article that lays out how to use Single Sign-on with impersonation. So the first thing I did was to get Single Sign-on set up using this MSDN article, which was linked in the article. Wow! That was pretty easy. I actually have SSO working for the first time, and now it seems pretty simple.

The next step for me was to take the sample code and try it out. I created a new WebPart library and created my new SSO sample web part. Everything was set up just like the article (I think), but when I ran it I got the dreaded:

Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.

OK, so why doesn't this work? After doing some testing myself (using Page.User.Identity and WindowsIdentity.GetCurrent().Name) I came to the conclusion that something isn't working, although I'm not sure what it is. It must work in some situations, because Jay Nathan has done a lot of work on this (here and here), this article references it, and so does Barry's Blog here on January 24, 2005 (could we get a permalink, Barry?). So these guys must be using it and it must be working.
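For reference, the testing was nothing fancier than dumping the two identities from inside a web part, roughly like this (a sketch; the class name is a placeholder of my own):

using System.Security.Principal;
using System.Web.UI;
using Microsoft.SharePoint.WebPartPages;

// Hypothetical diagnostic web part that writes out both identities so they
// can be compared while testing impersonation.
public class IdentityCheckPart : WebPart
{
    protected override void RenderWebPart(HtmlTextWriter output)
    {
        // the principal ASP.NET authenticated for this request
        output.Write("Page.User: " + Page.User.Identity.Name + "<br>");

        // the Windows identity the current thread is actually running as
        output.Write("WindowsIdentity: " + WindowsIdentity.GetCurrent().Name);
    }
}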

Patrick seems to come to a different conclusion here. He seems to be saying that impersonation doesn't work and that it is easier to just use COM+ instead. A third example of impersonation can be found at this MSDN article. All three of these examples do the same thing, and in all three cases I get the same error message above when I try to connect to the database using a trusted connection.

I can solve Patrick's problem by creating a new WindowsPrincipal object and assigning it to the current context (after keeping a reference to the current IPrincipal so it can be put back on undo). Here is an example of how to do this.

1) First you will need to change the Impersonate method on the Impersonator class:

public WindowsIdentity Impersonate()
{
    // authenticates the domain user account and begins impersonating it
    WindowsIdentity id = this.Logon();
    this.impersonationContext = id.Impersonate();
    return id;
}

2) Next you will need to store the current Principal and then set the new one.

// keep a reference to the original principal so it can be restored later
IPrincipal p = base.Context.User;
....
// wrap the impersonated identity in a principal and make it the current user
WindowsPrincipal wp = new WindowsPrincipal(Impersonater.Impersonate());
base.Context.User = wp;

3) You can then do stuff using the SharePoint object model, since Context.User has now been replaced with the impersonated user. When you're done, you put back the original user.

// stop impersonating and restore the original principal
Impersonater.Undo();
base.Context.User = p;

Using this code, both WindowsIdentity.GetCurrent().Name and Context.User.Identity.Name return the name of the impersonated user.
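Putting the three steps together, the pattern inside the web part looks roughly like this (a sketch only: it assumes the Impersonater helper from the referenced article with the modified Impersonate method above, and the try/finally is my own addition so the original principal is always put back):

using System.Security.Principal;

// Sketch: Impersonater is the helper instance from the referenced article,
// with the Impersonate method changed as shown in step 1.
IPrincipal original = base.Context.User;
try
{
    // switch both the thread token and the HttpContext principal
    WindowsIdentity impersonated = Impersonater.Impersonate();
    base.Context.User = new WindowsPrincipal(impersonated);

    // ... work against the SharePoint object model or SQL Server here ...
}
finally
{
    // always revert, even if the impersonated work throws
    Impersonater.Undo();
    base.Context.User = original;
}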

All this to say that it still doesn't seem to accomplish what I need. I still get that error and I'm really not sure why. Any ideas?

RsWebParts 1.3

Posted in Reporting Services | SharePoint at Tuesday, 22 February 2005 14:03 Pacific Standard Time

Just finished uploading the RsWebParts 1.3 package to the project site. I added a quick fix for the parameters issue: you can now include the standard Reporting Services toolbar and parameters, which should address the problem. It isn't as flexible and it takes up some of the report view space, but it works until I come up with something better.

Thanks to everyone for all the great feedback, especially David Korn for testing out the new version.

RsWebParts 1.2

Posted in Reporting Services | SharePoint at Friday, 18 February 2005 05:19 Pacific Standard Time

Not a very exciting release. You can get the installer and the source code (and the DwpGenerator binary and source) at the RsWebParts SourceForge Project.

I'm still working on resolving the parameters issue. I think I'm close to a solution, but several people needed the source files ASAP, so I went with a much simpler release for now.

Automating WebPart Library Builds

Posted in SharePoint | ASP.Net/Web Services at Friday, 18 February 2005 03:55 Pacific Standard Time

Back when I was working on version 1.1 of the RsWebParts, I set up a build script for my project which I think is pretty useful. It automates the process of creating DWP files and packaging the project using the WP Packager tool.

The first step in automating this process was to create a console application that I called the DwpGenerator. I did this because I grew tired of forgetting to change version numbers or trying to look up the public key token of an assembly. The DwpGenerator is a simple application that takes a web part assembly and generates a dwp file for each web part it finds in the assembly. It uses .NET attributes to gather other information, such as title and description, which are set in the web part assembly. For example, here is the RsExplorer web part's class declaration:

[DefaultProperty("ServerUrl"),
 ToolboxData("<{0}:RsExplorer runat=server>"),
 XmlRoot(Namespace="Bml.RsWebParts"),
 Description("Explore a Reporting Services Server."),
 Title("RS Explorer"),
 PartImage("_WPR_/ReportExplorer.gif")]
public class RsExplorer : RsBasePart, IRowProvider, 
IPostBackEventHandler, IDesignTimeHtmlProvider 
{
    ....
}

The DwpGenerator picks up the title and description from these attributes. The generator doesn't look for any specific class of attributes, so you can create your own. It only looks at the name of the attribute to determine if it should use it. So for the above web part the dwp that gets generated looks like:


<WebPart xmlns="http://schemas.microsoft.com/WebPart/v2">
  <Title>RS Explorer</Title>
  <Description>Explore a Reporting Services Server.</Description>
  <Assembly>Bml.RsWebParts, Version=1.2.0.0, Culture=neutral, PublicKeyToken=4fafef280eaa1b9c</Assembly>
  <TypeName>Bml.RsWebParts.RsExplorer</TypeName>
  <PartImageLarge>_WPR_/ReportExplorer.gif</PartImageLarge>
</WebPart>
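The generator itself isn't much more than a reflection loop. Here is a minimal sketch of the idea (this is not the actual DwpGenerator source; the helper names are mine and the attribute matching is simplified to the Title/Description case):

using System;
using System.IO;
using System.Reflection;

// Hypothetical sketch of the DwpGenerator idea: load a web part assembly,
// pick up Title/Description attributes by name, and write a .dwp per web part.
class DwpGeneratorSketch
{
    static void Main(string[] args)
    {
        Assembly asm = Assembly.LoadFrom(args[0]);   // path to the web part dll
        string outputDir = args[1];                  // folder to write .dwp files into

        foreach (Type type in asm.GetTypes())
        {
            if (!IsWebPart(type))
                continue;

            string title = GetNamedAttributeValue(type, "TitleAttribute", "Title") ?? type.Name;
            string description = GetNamedAttributeValue(type, "DescriptionAttribute", "Description") ?? "";

            string dwp =
                "<WebPart xmlns=\"http://schemas.microsoft.com/WebPart/v2\">\r\n" +
                "  <Title>" + title + "</Title>\r\n" +
                "  <Description>" + description + "</Description>\r\n" +
                "  <Assembly>" + asm.GetName().FullName + "</Assembly>\r\n" +
                "  <TypeName>" + type.FullName + "</TypeName>\r\n" +
                "</WebPart>";

            File.WriteAllText(Path.Combine(outputDir, type.Name + ".dwp"), dwp);
        }
    }

    // Walk the inheritance chain by name so no compile-time SharePoint reference is needed here.
    static bool IsWebPart(Type t)
    {
        for (Type b = t.BaseType; b != null; b = b.BaseType)
            if (b.FullName == "Microsoft.SharePoint.WebPartPages.WebPart")
                return true;
        return false;
    }

    // Match attributes by name only, then read the property of the same name.
    static string GetNamedAttributeValue(Type type, string attributeName, string propertyName)
    {
        foreach (object attr in type.GetCustomAttributes(false))
        {
            if (attr.GetType().Name != attributeName)
                continue;
            PropertyInfo p = attr.GetType().GetProperty(propertyName);
            if (p != null)
                return p.GetValue(attr, null) as string;
        }
        return null;
    }
}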

The next part of the automation involves the wppackager tool. This tool is great for generating an installer program for web parts. The only drawback is that you need to copy every file into the same folder for the tool to work. However, it is pretty easy to automate copying files. So here is my post-build event command line for the Bml.RsWebParts project:

rem generate a .dwp file for each web part in the compiled assembly
c:\temp\DwpGenerator.exe "$(TargetPath)" "$(ProjectDir)"

rem copy everything wppackager needs into the build output folder
copy "$(ProjectDir)\manifest.xml" "$(TargetDir)"
copy "$(ProjectDir)\*.dwp" "$(TargetDir)"
copy "$(ProjectDir)\images\*.gif" "$(TargetDir)"
copy "$(ProjectDir)\xsl\*.xsl" "$(TargetDir)"
copy "$(ProjectDir)\scripts\*.js" "$(TargetDir)"

rem build the installer package, then clean up the copied files
"c:\program files\wppackager\wppackager.exe" "$(ProjectDir)\wppackager.xml"
del "$(TargetDir)\*.xml"
del "$(TargetDir)\*.gif"
del "$(TargetDir)\*.dwp"
del "$(TargetDir)\*.xsl"
del "$(TargetDir)\*.js"

That is all there is to it. Now I can make changes to my project and when I build it everything is updated.

Google's Mini

Posted in SharePoint | General at Tuesday, 15 February 2005 04:11 Pacific Standard Time

Just ran across this today: the Google Mini.

Meet the Google Mini. Designed to help small and medium-sized businesses make the most of their digital assets, the Mini is a hardware and software search appliance that delivers the power and productivity of Google search across your organization’s documents and websites.

The Google Mini:

  • Indexes and searches up to 50,000 documents.
  • Works with over 220 different file formats, including HTML, PDF and Microsoft Office.
  • Can be set up in under an hour and requires minimal ongoing administration.
  • Costs $4,995 for all hardware and software, including a year of support and hardware replacement coverage.


This has some interesting implications for SharePoint. While it isn't as feature-rich as SharePoint, it is definitely a competitor in the information aggregation space.

Update: Sahil Malik asks how this relates to SharePoint. Perhaps I should have clarified and said SharePoint Portal Server. To see the relation, take a look at Microsoft's Top 10 Benefits of SharePoint Portal Server 2003.

Find and reuse timely and relevant information from systems and reports, and quickly locate and access documents, projects, and best practices by searching or browsing—all through the portal.

...

The industry-leading search technology in SharePoint Portal Server 2003 enables you to locate files, project plans, and best practices in file shares, Web sites, Microsoft Exchange Public Folders, Lotus Notes, Windows SharePoint Services sites, and databases instead of re-creating the wheel.

Now I'm not saying that the Google Mini is a direct competitor to SharePoint since SP offers much more than search. However, when it comes to information aggregation the Google Mini is clearly a competitor. I've no idea if it can search Lotus Notes or Exchange, but you could probably figure out a way to make it work.

SharePoint Recycle Bins

Posted in SharePoint at Monday, 14 February 2005 10:41 Pacific Standard Time

I recently read the MSDN article Add a Recycle Bin to Windows SharePoint Services for Easy Document Recovery by Maxim V. Karpov and Eric Schoonover. It was an interesting read, but I was pretty amazed at the lengths they had to go to in order to get something as simple as a recycle bin. Obviously, the biggest setback was the fact that

events are processed asynchronously. As a result, a registered event sink will only be notified about the document deletion after that document has already been deleted from the SQL Server backend database. As a result, the event sink can't simply copy the document to the recycle bin library because the deleted document no longer exists.

Sounds like a YASPQ (or YASQ if you prefer) to me. So in order to create the recycle bin, Maxim and Eric end up mirroring the document libraries. While reading the article, my mind couldn't help trying to come up with a simpler method. Here is my own version of a SharePoint recycle bin (note: don't play around with your SharePoint databases if you don't know what you're getting yourself into. I always build and test my scripts out on test servers and sites, and this is what I would call a pre-alpha release).

Connect to the _SITE database using Query Analyzer and run the following SQL Scripts.

1) Create the RecycledDocs table which is basically a copy of the Docs table:

-- create the RecycledDocs table
-- (where 1 = 0 copies the schema of Docs without copying any rows)
select *
into RecycledDocs
from Docs
where 1 = 0

 

2) Create an INSTEAD OF trigger on the Docs table that redirects requests to delete documents:

create trigger doc_recycle on docs instead of delete
as
  -- remove any previously recycled copies of these documents
  delete  RecycledDocs
  where  id in (select id from deleted)

  -- copy the deleted documents (type = 0) into the recycle bin table
  insert into RecycledDocs
  select   id, siteid, dirname, leafname, webid, listid, doclibrowid,
    type, size, metainfosize, version, uiversion, dirty, cacheparseid,
    docflags, thicketflag, charset, timecreated, timelastmodified, 
    nexttolasttimemodified, metainfotimelastmodified, timelastwritten,
    setuppath, checkoutuserid, checkoutdate, checkoutexpires, checkoutsize,
    versioncreatedsincestcheckout, ltcheckoutuserid, virusvendorid,
    virusstatus, virusinfo, metainfo, content, checkoutcontent
  from   Docs 
  where   type = 0 and Id in (select Id from deleted)

  -- now carry out the original delete
  delete  docs
  where  id in (select id from deleted)

 

That is basically it. Of course, there is a lot of functionality that needs to be added from a user perspective, but as an admin you can now restore any document that gets deleted.
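As a rough illustration of what that admin-side restore could look like, here is a sketch of my own (not part of the scripts above; the class name, connection string, and document id are placeholders, and you would want to try it on a test site first, just like the scripts above):

using System;
using System.Data.SqlClient;

// Hypothetical admin-side restore: copy one recycled row back into Docs
// and remove it from RecycledDocs in the same _SITE database.
class RecycleBinRestore
{
    static void Restore(string connectionString, Guid docId)
    {
        // RecycledDocs was created from Docs with select * into, so the column
        // layouts line up and a select * insert is enough for this sketch.
        const string sql =
            "insert into Docs select * from RecycledDocs where Id = @Id; " +
            "delete RecycledDocs where Id = @Id;";

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@Id", docId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}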

I'm not sure how this would affect Microsoft's support of a SharePoint installation; it may or may not. Your RecycledDocs table will also fill up (much like the Windows Recycle Bin) until you manually empty it. I'm hoping to do a longer write-up on this along with either a web part or an ASPX page that can manage it.

So there is my simple method of creating a Recycle Bin in SharePoint. It isn't very fancy, but it works. :)

Backing up SharePoint

Posted in SharePoint at Wednesday, 09 February 2005 07:10 Pacific Standard Time

I looked into backing up SharePoint a while back and didn't come away with a good solution. Recently MSDN published a new article on How to Write a Back Up and Restore Application for SharePoint Portal Server 2003. The first time I looked at the article I skimmed it, didn't find it very useful, and wrote it off. However, Brian E. Cooper blogged that it was Worth a Read!

So I went back, downloaded the sample, and went through the article, and I must admit, it was worth a read. There were several things I did like and some that I did not.

First, I liked the fact that I was able to do a full backup and restore of a SharePoint Portal. I built a portal on a Virtual Server, backed it up, messed it up, and then restored it. It worked pretty well. Everything was restored except for the WebPart packages. For some reason I had to uninstall/reinstall my web part packages to get them to work. However, this isn't too much of an issue. The main thing was that the content and structure were restored successfully.

Second, the application that comes with the article is a sample application (as referenced by the application name SPSBackupSample). So, what is a SharePoint junkie to do with a sample application except create a full-blown application? I spent much of the day yesterday building my own version of the application written in C#. Everything was going well until I hit the section that backs up the indexes using the CSPSBackupClass. This class is used to back up the indexes to a stream using a method called BackupToStream.

In the sample application (which is written in C++) you pass in an IStream pointer. In C# you should be able to do this (if I understand things correctly) by passing the pointer to a UCOMIStream object (using Marshal.GetIUnknownForObject). However, it doesn't seem to matter what I do; I get an exception from the CSPSBackupClass object. So I'm stuck at this point.

I could use the BackupToFile option but I was hoping to keep my sample compatible with the original sample. Any ideas?

Update: In case anyone is interested, here are the exception details that I'm getting:

System.UnauthorizedAccessException: Access is denied.
   at mssctlbm.CSPSBackupClass.BackupToStream(String bstrHostName, Object pStream, String bstrPassword, String bstrSite)
   at BackupSPS.BackupSite.Backup(String location, String backupName)

Update: OK, I was able to get this to work. I ended up writing a C++ console application that calls my C# class library. The problem was that I needed to make these two calls when my program started up:

    // initialize COM with a multithreaded apartment
    CoInitializeEx(NULL, COINIT_MULTITHREADED);

    // set the process-wide COM security so outgoing calls can impersonate
    HRESULT hr = CoInitializeSecurity(NULL, -1, NULL, NULL, 
                                      RPC_C_AUTHN_LEVEL_DEFAULT,
                                      RPC_C_IMP_LEVEL_IMPERSONATE,
                                      NULL, 
                                      EOAC_NONE, 
                                      NULL);
 

According to PInvoke.net you shouldn't call CoInitializeSecurity from managed code:

That's because the CLR will almost always call CoInitialize upon startup before execution enters your main method, and CoInitialize will implicitly call CoInitializeSecurity if it hasn't already been called.  Therefore, calling this from managed code will usually return RPC_E_TOO_LATE.

Once I switched to using a C++ app that calls my class library, everything started working. Once I get this completed, I will probably post it to a new project on SourceForge in case anyone is interested.
