What do you know

Posted in SharePoint at Thursday, February 24, 2005 5:36 AM Pacific Standard Time

From Tim Heuer's blog (via Tony Dowler):

inspired by scott hanselmen's post about What Great .NET Developers Ought to Know, as well as the rush of interviews i've been conducting lately to fill sharepoint positions, i started compiling a list of questions for what a sharepoint consultant ought to know...here it is...

Go read it to see how you measure up. Some good questions in there that will help you see what areas of SharePoint you're weak in (if any). It seems like a pretty well-rounded list of questions.

If you do really well on the SharePoint questions and feel like you know everything, go read Scott's What Great .NET Developers Ought to Know; it will probably bring you back down to earth :)

Single Sign-On, Impersonation, and SqlConnections

Posted in SharePoint | ASP.Net/Web Services at Wednesday, February 23, 2005 10:14 AM Pacific Standard Time

This morning I came across this article: Impersonation, Single Sign-on, and SPS. It is a very interesting article that lays out how to use Single Sign-on with impersonation. So the first thing I did was to get Single Sign-On set up using this MSDN article, which was linked in the article. Wow! That was pretty easy. I actually have SSO working for the first time, and now it seems pretty simple.

The next step for me was to take the sample code and try it out. I created a new WebPart library and created my new SSO sample webpart. Everything was set up just like the article (I think), but when I ran it I got the dreaded:

Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.

Ok. So why doesn't this work? Well, after doing some testing myself (using Page.User.Identity and WindowsIdentity.GetCurrent().Name) I came to the conclusion that something doesn't work, although I'm not sure what it is. It must work in some situations because Jay Nathan has done a lot of work on this (here and here), this article references it, and so does Barry's Blog here on January 24, 2005 (could we get a permalink, Barry?). So these guys must be using it and it must be working.

Patrick seems to come to a different conclusion here: impersonation doesn't work, and it is easier to just use COM+ instead. A third example of impersonation can be found in this MSDN article. All three of these examples do the same thing, and in all three cases I get the same error message above when I try to connect to the database using a trusted connection.

I can solve Patrick's problem by creating a new WindowsPrincipal object and assigning it to the current context (after making a copy of the current IPrincipal so I can restore it on undo). Here is an example of how to do this.

1) First you will need to change the Impersonate method on the Impersonator class:

public WindowsIdentity Impersonate()
{
    // authenticates the domain user account and begins impersonating it
    WindowsIdentity id = this.Logon();
    this.impersonationContext = id.Impersonate();
    return id;
}

2) Next you will need to store the current Principal and then set the new one.

// keep a reference to the current principal so it can be restored later
IPrincipal p = base.Context.User;
....
// impersonate and replace the current principal with the impersonated user
WindowsPrincipal wp = new WindowsPrincipal(Impersonator.Impersonate());
base.Context.User = wp;

3) You can then do stuff using the SharePoint object model, since Context.User has now been replaced with the impersonated user. When you're done you put back the original user.

Impersonator.Undo();        // end the impersonation
base.Context.User = p;      // restore the original principal

Using this code, both WindowsIdentity.GetCurrent().Name and Context.User.Identity.Name return the name of the impersonated user.
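To make the failing scenario concrete, here is roughly the sequence I'm attempting. This is only a sketch of a hypothetical web part method (the connection string and server name are made up, and Impersonator is the same object used above); it needs System.Data.SqlClient and System.Security.Principal.

// sketch only: impersonate, try a trusted SqlConnection, then clean up
IPrincipal original = base.Context.User;
base.Context.User = new WindowsPrincipal(Impersonator.Impersonate());
try
{
    // this is the call that still fails with "Login failed for user '(null)'"
    SqlConnection conn = new SqlConnection(
        "Data Source=MySqlServer;Initial Catalog=Northwind;Integrated Security=SSPI");
    conn.Open();
    // ... query as the impersonated user ...
    conn.Close();
}
finally
{
    Impersonator.Undo();          // end the impersonation
    base.Context.User = original; // restore the original principal
}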

All this to say that it still doesn't seem to accomplish what I need to accomplish. I still get that error and I'm really not sure why. Any ideas?

RsWebParts 1.3

Posted in Reporting Services | SharePoint at Tuesday, February 22, 2005 2:03 PM Pacific Standard Time

Just finished uploading the RsWebParts 1.3 package to the project site. I added a quick fix for the parameters issue: you can now include the standard Reporting Services toolbar and parameters, which should take care of the problem. It isn't as flexible and it takes up some of the report view space, but it works until I come up with something better.

Thanks to everyone for all the great feedback, especially David Korn for testing out the new version.

RsWebParts 1.2

Posted in Reporting Services | SharePoint at Friday, February 18, 2005 5:19 AM Pacific Standard Time

Not a very exciting release. You can get the installer and the source code (and the DwpGenerator binary and source) at the RsWebParts SourceForge Project.

I'm still working on resolving the parameters issue. I think I'm close to a solution, but there were several people who needed the source files ASAP, so I went with a much simpler release for now.

Automating WebPart Library Builds

Posted in SharePoint | ASP.Net/Web Services at Friday, February 18, 2005 3:55 AM Pacific Standard Time

Back when I was working on version 1.1 of the RsWebParts I set up a build script for my project which I think is pretty useful. It automates the process of creating DWP files and packaging the project using the WP Packager tool.

The first step in automating this process was to create a console application that I called the DwpGenerator. I did this because I grew tired of forgetting to change version numbers or trying to look up the public key token of an assembly. The DwpGenerator is a simple application that takes a web part assembly and generates a dwp file for each web part it finds in the assembly. It uses .Net Attributes to gather other information such as title and description. These are set in the web part assembly. For example, here is the RsExplorer web part's class declaration:

[DefaultProperty("ServerUrl"),
 ToolboxData("<{0}:RsExplorer runat=server>"),
 XmlRoot(Namespace="Bml.RsWebParts"),
 Description("Explore a Reporting Services Server."),
 Title("RS Explorer"),
 PartImage("_WPR_/ReportExplorer.gif")]
public class RsExplorer : RsBasePart, IRowProvider, 
IPostBackEventHandler, IDesignTimeHtmlProvider 
{
    ....
}

The DwpGenerator picks up the title and description from these attributes. The generator doesn't look for any specific class of attributes, so you can create your own. It only looks at the name of the attribute to determine if it should use it. So for the above web part the dwp that gets generated looks like:


<WebPart xmlns="http://schemas.microsoft.com/WebPart/v2">
  <Title>RS Explorer</Title>
  <Description>Explore a Reporting Services Server.</Description>
  <Assembly>Bml.RsWebParts, Version=1.2.0.0, Culture=neutral, PublicKeyToken=4fafef280eaa1b9c</Assembly>
  <TypeName>Bml.RsWebParts.RsExplorer</TypeName>
  <PartImageLarge>_WPR_/ReportExplorer.gif</PartImageLarge>
</WebPart>
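In case it helps picture how that kind of tool hangs together, here is a rough sketch of the reflection approach. This is not the actual DwpGenerator source, just the general idea, and the class name here is made up:

// Sketch only: load the web part assembly, find the web part classes, and
// collect Title/Description-style attribute values by attribute name.
using System;
using System.Collections;
using System.Reflection;

public class DwpGeneratorSketch
{
    public static void Main(string[] args)
    {
        Assembly asm = Assembly.LoadFrom(args[0]); // e.g. Bml.RsWebParts.dll

        foreach (Type type in asm.GetTypes())
        {
            // crude web part test: walk the base types looking for a class named WebPart
            bool isWebPart = false;
            for (Type t = type.BaseType; t != null; t = t.BaseType)
            {
                if (t.Name == "WebPart") { isWebPart = true; break; }
            }
            if (!isWebPart || type.IsAbstract)
                continue;

            Hashtable values = new Hashtable();
            foreach (object attr in type.GetCustomAttributes(true))
            {
                // match attributes by name only (e.g. TitleAttribute -> "Title") and
                // read a same-named string property if the attribute has one
                string key = attr.GetType().Name.Replace("Attribute", "");
                PropertyInfo prop = attr.GetType().GetProperty(key, typeof(string));
                if (prop != null)
                    values[key] = (string)prop.GetValue(attr, null);
            }

            // write a .dwp for this web part to the console
            Console.WriteLine("<WebPart xmlns=\"http://schemas.microsoft.com/WebPart/v2\">");
            Console.WriteLine("  <Title>{0}</Title>", values["Title"]);
            Console.WriteLine("  <Description>{0}</Description>", values["Description"]);
            Console.WriteLine("  <Assembly>{0}</Assembly>", asm.GetName().FullName);
            Console.WriteLine("  <TypeName>{0}</TypeName>", type.FullName);
            Console.WriteLine("</WebPart>");
        }
    }
}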

The next part of the automation involves the wppackager tool. This tool is great for generating an installer program for web parts. The only drawback is that you need to copy every file into the same folder for the tool to work. However, it is pretty easy to automate copying files. So here is my post-build event command line for the Bml.RsWebParts project:

c:\temp\DwpGenerator.exe "$(TargetPath)" "$(ProjectDir)"
copy "$(ProjectDir)\manifest.xml" "$(TargetDir)"
copy "$(ProjectDir)\*.dwp" "$(TargetDir)"
copy "$(ProjectDir)\images\*.gif" "$(TargetDir)"
copy "$(ProjectDir)\xsl\*.xsl" "$(TargetDir)"
copy "$(ProjectDir)\scripts\*.js" "$(TargetDir)"
"c:\program files\wppackager\wppackager.exe" "$(ProjectDir)\wppackager.xml"
del  "$(TargetDir)\*.xml"
del  "$(TargetDir)\*.gif"
del  "$(TargetDir)\*.dwp"
del  "$(TargetDir)\*.xsl"
del  "$(TargetDir)\*.js"

That is all there is to it. Now I can make changes to my project and when I build it everything is updated.

Google's Mini

Posted in SharePoint | General at Tuesday, February 15, 2005 4:11 AM Pacific Standard Time

Just ran across this today: the Google Mini.

Meet the Google Mini. Designed to help small and medium-sized businesses make the most of their digital assets, the Mini is a hardware and software search appliance that delivers the power and productivity of Google search across your organization’s documents and websites.

The Google Mini:

  • Indexes and searches up to 50,000 documents.
  • Works with over 220 different file formats, including HTML, PDF and Microsoft Office.
  • Can be set up in under an hour and requires minimal ongoing administration.
  • Costs $4,995 for all hardware and software, including a year of support and hardware replacement coverage.

The Google Mini 

This has some interesting implications for SharePoint. While it isn't as feature-rich as SharePoint, it is definitely a competitor in the information aggregation space.

Update: Sahil Malik asks how this relates to SharePoint. Perhaps I should have clarified and said SharePoint Portal Server. To see the relation, take a look at Microsoft's Top 10 Benefits of SharePoint Portal Server 2003.

Find and reuse timely and relevant information from systems and reports, and quickly locate and access documents, projects, and best practices by searching or browsing—all through the portal.

...

The industry-leading search technology in SharePoint Portal Server 2003 enables you to locate files, project plans, and best practices in file shares, Web sites, Microsoft Exchange Public Folders, Lotus Notes, Windows SharePoint Services sites, and databases instead of re-creating the wheel.

Now I'm not saying that the Google Mini is a direct competitor to SharePoint since SP offers much more than search. However, when it comes to information aggregation the Google Mini is clearly a competitor. I've no idea if it can search Lotus Notes or Exchange, but you could probably figure out a way to make it work.

SharePoint Recycle Bins

Posted in SharePoint at Monday, February 14, 2005 10:41 AM Pacific Standard Time

I recently read the MSDN article Add a Recycle Bin to Windows SharePoint Services for Easy Document Recovery by Maxim V. Karpov and Eric Schoonover. It was an interesting read, but I was pretty amazed at the lengths they had to go to in order to get something as simple as a recycle bin. Obviously, the biggest setback was the fact that

events are processed asynchronously. As a result, a registered event sink will only be notified about the document deletion after that document has already been deleted from the SQL Server backend database. As a result, the event sink can't simply copy the document to the recycle bin library because the deleted document no longer exists.

Sounds like a YASPQ (or YASQ if you prefer) to me. So in order to create the recycle bin, Maxim and Eric end up mirroring the document libraries. While reading the article my mind couldn't help trying to come up with a simpler method. Here is my own version of a SharePoint recycle bin (note: don't play around with your SharePoint databases if you don't know what you're getting yourself into. I always build and test my scripts on test servers and sites, and this is what I would call a pre-alpha release).

Connect to the _SITE database using Query Analyzer and run the following SQL Scripts.

1) Create the RecycledDocs table which is basically a copy of the Docs table:

-- create the RecycledDocs table
select *
into RecycledDocs
from Docs
where 1 = 0

 

2) Create an INSTEAD OF DELETE trigger on the Docs table that redirects requests to delete documents:

create trigger doc_recycle on docs instead of delete
as
  -- clear out any previously recycled copies of these documents
  delete  RecycledDocs
  where  id in (select id from deleted)
  -- copy the documents being deleted (type = 0) into the recycle bin table
  insert into RecycledDocs
  select   id, siteid, dirname, leafname, webid, listid, doclibrowid,
    type, size, metainfosize, version, uiversion, dirty, cacheparseid,
    docflags, thicketflag, charset, timecreated, timelastmodified, 
    nexttolasttimemodified, metainfotimelastmodified, timelastwritten,
    setuppath, checkoutuserid, checkoutdate, checkoutexpires, checkoutsize,
    versioncreatedsincestcheckout, ltcheckoutuserid, virusvendorid,
    virusstatus, virusinfo, metainfo, content, checkoutcontent
  from   Docs 
  where   type = 0 and Id in (select Id from deleted)
  -- now perform the actual delete from Docs
  delete  docs
  where  id in (select id from deleted)

 

That is basically it. Of course, there is a lot of functionality that needs to be added from a user perspective, but as an admin you can now restore any document that gets deleted.
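For example, restoring a document from the bin could look something like this. This is a hypothetical admin-side sketch (the method name and connection string are made up, and it requires the System.Data and System.Data.SqlClient namespaces), not part of the scripts above:

// Hypothetical restore helper: copy the row back into Docs, then remove it
// from RecycledDocs. Run it only against a test _SITE database.
public static void RestoreDocument(string siteDbConnectionString, Guid docId)
{
    string sql = "insert into Docs select * from RecycledDocs where Id = @Id; " +
                 "delete from RecycledDocs where Id = @Id";

    using (SqlConnection conn = new SqlConnection(siteDbConnectionString))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.Add("@Id", SqlDbType.UniqueIdentifier).Value = docId;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}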

I'm not sure how this would affect Microsoft's support of a SharePoint installation; it may or may not be an issue. Your RecycledDocs table will also keep filling up (much like the Windows Recycle Bin) until you manually empty it. I'm hoping to do a longer write-up on this along with either a web part or an ASPX page that can manage it.

So there is my simple method of creating a Recycle Bin in SharePoint. It isn't very fancy, but it works. :)

Backing up SharePoint

Posted in SharePoint at Wednesday, February 09, 2005 7:10 AM Pacific Standard Time

I looked into backing up SharePoint a while back and didn't come away with a good solution. Recently MSDN published a new article on How to Write a Back Up and Restore Application for SharePoint Portal Server 2003. The first time I looked at the article I skimmed it, didn't find it very useful, and wrote it off. However, Brian E. Cooper blogged that it was Worth a Read!

So I went back, downloaded the sample, and went through the article, and I must admit, it was worth a read. There were several things I did like and some that I did not.

First, I liked the fact that I was able to do a full backup and restore of a SharePoint Portal. I built a Portal on a Virtual Server, backed it up, messed it up, and then restored it. It worked pretty well. Everything was restored except for the WebPart packages. For some reason I had to uninstall/reinstall my web part packages to get them to work. However, this isn't too much of an issue. The main thing was that the content and structure were restored successfully.

Second, the application that comes with the article is a sample application (as reflected by the application name, SPSBackupSample). So, what is a SharePoint junkie to do with a sample application except create a full-blown application? I spent much of the day yesterday building my own version of the application written in C#. Everything was going well until I hit the section that backs up the indexes using the CSPSBackupClass. This class is used to back up the indexes to a stream using a method called BackupToStream.

In the sample application (which is written in C++) you pass in an IStream pointer. In C# you should be able to do this (if I understand things correctly) by passing the pointer to a UCOMIStream object (using Marshal.GetIUnknownForObject). However, no matter what I do, I get an exception from the CSPSBackupClass object. So I'm stuck at this point.

I could use the BackupToFile option but I was hoping to keep my sample compatible with the original sample. Any ideas?

Update: In case anyone is interested, here are the details of the exception I'm getting:

System.UnauthorizedAccessException: Access is denied.
   at mssctlbm.CSPSBackupClass.BackupToStream(String bstrHostName, Object pStream, String bstrPassword, String bstrSite)
   at BackupSPS.BackupSite.Backup(String location, String backupName)

Update: OK, I was able to get this to work. I ended up writing a C++ console application that calls my C# class library. The problem was that I needed to make these two calls when my program started up:

    // initialize COM on a multithreaded apartment
    CoInitializeEx(NULL, COINIT_MULTITHREADED);
    
    // set the process-wide COM security blanket and allow impersonation
    HRESULT hr = CoInitializeSecurity(NULL, -1, NULL, NULL, 
                                      RPC_C_AUTHN_LEVEL_DEFAULT,
                                      RPC_C_IMP_LEVEL_IMPERSONATE,
                                      NULL, 
                                      EOAC_NONE, 
                                      NULL);
 

According to PInvoke.net you shouldn't call CoInitializeSecurity from managed code:

That's because the CLR will almost always call CoInitialize upon startup before execution enters your main method, and CoInitialize will implicitly call CoInitializeSecurity if it hasn't already been called.  Therefore, calling this from managed code will usually return RPC_E_TOO_LATE.

Once I switched to using a C++ app that called my class library, everything started working. When I get this completed I will probably post it to a new project on SourceForge in case anyone is interested.

Bill Gates on SharePoint and Infopath

Posted in SharePoint at Wednesday, February 09, 2005 6:56 AM Pacific Standard Time

Mark Bower notes:

Gates went on to add more detail: InfoPath ‘with rich controls, on top of the Avalon runtime’, but also with the ability to ‘project onto classic HTML’.  So InfoPath is likely to evolve into a Forms package that can target thin-client HTML delivery and rich client Avalon delivery.  Look forward to it.

I watched the keynote as well, and I thought Bill's response on this, “it has taken us awhile to get forms right,” was interesting. I didn't pick up on the InfoPath connection like Mark did, which is an interesting take. A good forms package with thin and rich client delivery would be a good thing. The other interesting thing that stood out to me was that Bill likes SharePoint, as noted by Mark Muller:

He also mentioned that he was a big fan of Sharepoint 2003 and the solutions build around it. According to him, Sharepoint provides a foundation for Office applications and system integration. Good to hear this from him ;). Seems like Sharepoint developers won't have to fear to get bored.

For some reason I have this idea in my head that something was said about WinFS and SharePoint merging, but I don't remember if I heard it directly in the keynote or if I just munged some ideas together from all the blogs I read. Anyhow, the keynote is worth watching (hopefully they will post it up in the webcast archive).

 

SqlXml Undeprecated!

Posted in Sql and Xml at Monday, February 07, 2005 6:44 AM Pacific Standard Time

Wow, that was fast. Kent reports that SqlXml is deprecated, and minutes later Irwin (the SqlXml PM, or is he the SqlXml MVP?) responds that only the MDAC portion of SqlXml is deprecated, and only for versions of MDAC greater than 2.x. After that, Kent says that SqlXml may be underappreciated (not unappreciated as I originally thought). All in all, good news for SqlXml, which was not clear from reading the original MSDN article.

The big news though was that Kent was able to get Irwin to post to his blog! :)

(I know, I know, Irwin's blog has been deprecated and we are supposed to read the new XmlTeam blog instead....)

SharePoint, Reporting Services, and Firefox

Posted in Reporting Services | SharePoint | General at Monday, February 07, 2005 4:46 AM Pacific Standard Time

Mads created a GotDotNet workspace for tracking issues with SharePoint and non-IE browsers. I installed Firefox some time ago but put it aside since it didn't appear to do NTLM authentication, which is needed for SharePoint access. One of the first posts in the forum there is TIP: Use current Windows credentials in Firefox (warning: we are talking about GotDotNet, so the link to the post may be unusable about 50% of the time while the GotDotNet crew does some “maintenance“). The tip explains how to enable NTLM in Firefox. However, it didn't work for me, and after some googling I found this post which adds another setting to the mix, but that didn't solve the problem for me either. It turns out I had to add the server names to the bypass proxy list for it to work.

SharePoint seems to work except for ActiveX controls, which I see Mads already posted about. The most interesting thing to me, though, was checking out Reporting Services from Firefox, which is pretty much unusable. Very ugly. I will have to look into seeing if I can make the reports work using the RsWebParts (I'm still working on version 1.2; parameters are a pain).

SqlXml Deprecated

Posted in Sql and Xml at Monday, February 07, 2005 4:03 AM Pacific Standard Time

SqlXml is on the “deprecated” data technology list [via Kent]:

These components are still supported in the current release of MDAC, but might be removed in future releases. Microsoft recommends that when you develop new applications, you avoid using these components. Additionally, when you upgrade or modify existing applications, remove any dependency on these components.

Even though SqlXml is on this list, we read:

This component [SqlXml] is not being deprecated, but it is being removed from future MDAC releases. Current and later versions of this product are available as Web downloads. SQL XML will be available on the 64-bit Windows operating system. [Emphasis mine]

Ok, got that? It is on the list of deprecated technologies, but it is not being deprecated, so this leaves it as just unappreciated. :)

Update: I had to update this post since I confused deprecated with depreciated. :)

RssBandit 1.3 Beta

Posted in General at Monday, February 07, 2005 3:53 AM Pacific Standard Time

I upgraded to the latest beta of RssBandit this morning, and it has already saved me a lot of time. The new skim mode is great: it shows all the unread posts for a feed and marks them all as read when you move to the next feed. Makes reading blogs much quicker on Monday morning. My only issue with this new mode is how to keep one item as unread. For instance, I wanted to go back and read this post on the logging block extensions of Enterprise Library, but when I would mark it as unread it would get set back to read when I clicked on another feed.

Other than that everything works great! Thanks!

My BlogMap

Posted in General at Monday, February 07, 2005 3:47 AM Pacific Standard Time

Here is my BlogMap (via TestDriven.Net, mad's thoughts, and Mark Bower (after reading all three I had to check it out :)):

my blogmap

Searching the blogmaps I find there are 3 other blogs located near me: Greg's Cool [Insert Clever Name] of the Day, e-piphany, and VbAspCoder.

Page.RegisterStartupScript

Posted in Reporting Services | SharePoint | ASP.Net/Web Services at Thursday, February 03, 2005 3:46 AM Pacific Standard Time

Note to self:

Page.RegisterStartupScript just outputs the text you pass to the method and doesn't add script tags. If you don't add the script tags yourself, your text will be output at the bottom of every page (which is embarrassing). Why can't I remember this....
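Something along these lines keeps the script from showing up as plain text (a minimal sketch; the key name and the alert are made up):

// the script tags have to be part of the string you register
string script = "<script language=\"javascript\">alert('web part loaded');</script>";
if (!Page.IsStartupScriptRegistered("RsWebPartsStartup"))
{
    Page.RegisterStartupScript("RsWebPartsStartup", script);
}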

P.S. - In case you were wondering, I just discovered this is a bug in the RsWebParts version 1.1 which will be fixed in 1.2.

P.P.S. - I hear this is “fixed” in ASP.Net 2.0...

The Cost of Comments

Posted in General at Thursday, February 03, 2005 2:42 AM Pacific Standard Time

I was originally going to post this in the comments on Scott's post on the Worth(lessness) of CAPTCHAs, but my comments turned into a post.

I think CAPTCHAs need to be looked at from a cost perspective versus a technical perspective to see why they are effective.

For the blogger, the CAPTCHA has relatively no cost.

For the average user, the CAPTCHA has a very small cost: a few milliseconds extra to download the page and a brain cycle or two to read the image (if they post a comment).

For the comment spammer, the cost is relatively high. Either they have to write a complicated program that takes quite a bit of CPU power to crack each CAPTCHA or they have to "pay" someone to read it for them.

Once you raise the price of posting comment spam, even a little, it takes away the cost/benefit gain of the comment spammer. They lose when it costs them to post a comment. Even if they still post comments, it now costs them more and so they will make less.

I like Scott's authentication/moderation idea, but if you read the interview with the comment spammer on The Register, it is not enough, since registration can be automated. I think authentication/moderation pushes the cost to the wrong people: the blogger has to moderate (high cost), the user has to register (medium cost), while the spammer just automates registration (low cost).

Rendering Regret

Posted in Reporting Services | SharePoint at Tuesday, February 01, 2005 2:42 AM Pacific Standard Time

From Barry's Blog via Serge van den Oever on RenderWebPart:

When things go the way you expect, you want to render an entire batch of HTML. However, you don't want to render any of that HTML in the case of an exception when the logic in your Web Part determines that it should display an error message instead. Things can really get ugly if you render the opening tags for an HTML table and do not properly close them due to an exception. This can upset the high-level rendering logic of the page as well as other Web Parts.

The solution to this problem is to use some technique that allows you to write HTML into a buffer and then send it all in a single batch once you know that all the rendering logic has executed successfully. You can create a string buffer using an instance of the System.Text.StringBuffer class. You can then create an instance of the HtmlTextWriter class and use that to render HTML into the buffer.

Barry then gives some code that lets you buffer your HTML output, which in turn lets you handle exceptions gracefully. I liked the idea so much that I've already added it to the latest version of the RsWebParts (version 1.2, which I'm still trying to tidy up for release on sf.net). Since the RsWebParts all inherit from the RsBasePart, I simply did this:

        /// <summary>
        /// Renders the HTML contents of the web part.
        /// The base web part calls this and buffers the output in order to
        /// prevent the web part from causing the SharePoint page to fail. This code
        /// was taken from Barry's Blog (http://www.barracuda.net/barrysblog.aspx?Date=10/22/2004)
        /// </summary>
        /// <param name="output">HtmlTextWriter to write content with.</param>
        protected override void RenderWebPart(HtmlTextWriter output) 
        { 
            // create a buffer for the rendered output 
            StringBuilder buffer = new StringBuilder(10240);
            StringWriter InnerWriter = new StringWriter(buffer);
            HtmlTextWriter BufferWriter = new HtmlTextWriter(InnerWriter); 
            try 
            { 
                // call the derived web part's method to write its HTML into the buffer
                WriteWebPartContent(BufferWriter);
                // write the buffered HTML batch back to the browser
                output.Write(buffer);
            }
            catch(Exception ex) 
            { 
                output.Write("Web Part Error: " + ex.Message); 
            } 
        }

        /// <summary>
        /// The web part must override this method to render its content instead of
        /// overriding RenderWebPart.
        /// </summary>
        /// <param name="output">HtmlTextWriter to write content with.</param>
        protected abstract void WriteWebPartContent(HtmlTextWriter output);

Next I just renamed the RenderWebPart methods of each web part to WriteWebPartContent and now I have buffered output! Very cool.
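To show what a derived part looks like after the rename, here is a tiny hypothetical example (the class name and markup are made up):

// Hypothetical web part that inherits the buffering behavior from RsBasePart.
public class HelloWebPart : RsBasePart
{
    protected override void WriteWebPartContent(HtmlTextWriter output)
    {
        output.Write("<table><tr><td>");
        output.Write("Hello from a buffered web part");
        // if an exception is thrown anywhere in here, none of the partial
        // HTML above ever reaches the page; only the error message does
        output.Write("</td></tr></table>");
    }
}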

Barry has some other great articles on his blog and his company offers SharePoint training. My boss and one of my coworkers took the class and both learned quite a bit (I didn't take the class because I'm not really the SharePoint admin, just a web part developer).