Understanding the SO-Aware Test Workbench Test Types (part 1: Unit & Simple Load Testing)

Introduction

SO-Aware allows you to create unit tests for WCF SOAP and WCF REST based web services. These tests can be executed using the standard web-based user interface that SO-Aware provides, or using a PowerShell script. There is also an automated Windows service that can schedule a test to run on a recurring basis. The results of these tests are stored inside the SO-Aware repository.

After our customers utilized these testing features of SO-Aware, it became evident that there should be a client desktop form of this testing: a client application that can run these tests and export the results in a format suitable for inclusion in a reporting system such as SQL Server Reporting Services, where the results can be gathered and displayed using other tools.

The SO-Aware Test Workbench (TWB) is this client-side application. TWB contains many different types of tests that developers, architects, and IT administrators can use to stress test services in their enterprise. TWB supports executing single unit tests, just like SO-Aware, as well as Simple Load Tests, Linear Load Tests, Burst Load Tests, and SawTooth Load Tests.

Each type of test has a unique design pattern and purpose, allowing you to determine different metrics and derive different information from the results. In this series of posts, I will explain what each test is designed to do and describe each of the configurable parameters in detail. I will also explain a few ways to interpret the results.

In this first post, let’s discuss two tests: the Unit Test and the Simple Load Test. Before we delve into the specifics, I want to provide a baseline for comparison. If you want to run these same types of tests, I’d suggest you download SO-Aware and the SO-Aware Test Workbench now.

Just so that you know, all tests were performed on a VMware virtual machine. The VM image was configured with:

  • 1 CPU with 8 cores (Intel i7 2GHz)
  • Running on a 256 GB SSD (100GB allocated to image on single drive: C:\)
  • RAM 4GB
  • Networking Shared
  • Windows Server 2008 R2 (SP1)
    • Web Role
    • AppFabric
    • Active Directory Domain Controller
    • File Application Server Role
  • SQL 2008 R2 (Installed and Running)
  • SharePoint 2010 (Installed and Running)
  • BizTalk Server 2010 (Installed not running)
  • VS.NET 2010 (SP1)

Services used in these tests are:

    1. SO-Aware Service Repository (REST registered endpoint)
    2. People Finder Service (AppFabric WCF Service as found in the SO-Aware SDK)

Now with all the logistical settings out of the way, let’s continue.

Unit Testing

The Unit testing feature of TWB allows you to run individual tests configured inside SO-Aware and export the results in PDF and Excel formats for inclusion in other reporting tools.

This test is designed to test the functionality of a service operation. It is a single test that lets you obtain its execution time, success/fail status, and basic operation. By right-clicking on any test, you can select Run Test, which displays the screen below with options to change the registered message in SO-Aware, restore the original message, and run the single unit test.


Simple Load Test

The Simple Load Test is designed to let you run a single unit test over and over again for a certain period of time. The idea is that after you’ve run the unit test a few times, you should have a good feel for how long that test takes. If you then want to test that unit test with concurrency, that is, say, 10 copies of the same test running at the same time over 60 seconds, this is where the Simple Load Test comes into play.

You can see the execution time, success/fail status, and basic operation. Not only does the Simple Load Test let you test concurrency; by executing the test over and over, you can also test repeatability and get an average execution time on the current hardware and software design.

The parameters of the Simple Load test are:

Concurrency – how many threads to execute a single test on for concurrent processing.

Delay (ms) – how long to wait between launching batches of concurrent tests. This value should be set high enough to allow each batch to load up and execute without affecting the previous tests. Setting it lower simulates a constant (albeit linear) stress factor, with new batches of concurrent tests loading on top of tests that have not yet completed.

Duration (sec) – the number of seconds to continue running the concurrent tests.

Number of tests – the number of iterations to run of the complete duration of concurrent tests.
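To make these parameter semantics concrete, here is a minimal Python sketch of the Simple Load pattern. This is not TWB’s implementation; `run_unit_test` is a hypothetical stand-in for one configured unit test, and the timing logic only illustrates how Concurrency, Delay, Duration, and Number of tests interact:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_unit_test():
    """Hypothetical stand-in for one configured SO-Aware unit test."""
    time.sleep(0.01)  # pretend the service call takes ~10 ms
    return True

def simple_load_test(concurrency, delay_ms, duration_sec, number_of_tests):
    """Sketch of the Simple Load pattern: for each iteration, keep
    launching batches of `concurrency` tests until `duration_sec`
    elapses, waiting `delay_ms` between batches."""
    futures = []
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(number_of_tests):
            deadline = time.monotonic() + duration_sec
            while time.monotonic() < deadline:
                # launch `concurrency` copies of the test at once...
                futures.extend(pool.submit(run_unit_test) for _ in range(concurrency))
                # ...then wait `delay_ms` before the next batch; a small
                # delay stacks new batches on top of tests still running
                time.sleep(delay_ms / 1000.0)
    # all futures have completed once the pool shuts down
    return [f.result() for f in futures]

results = simple_load_test(concurrency=10, delay_ms=100, duration_sec=1, number_of_tests=1)
```

A low `delay_ms` relative to the test’s own execution time is what produces the “tests loading on top of tests” stress described above.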


In the above test, I executed the Get Services REST endpoint of an SO-Aware instance I have installed. Below are the first six rows from the exported Excel results workbook.

Making sense of the first six rows in the table:

Second | Count | Average (ms) | Errors | Passed | Thread Count
0      | 37    | 89.81081081  | 0      | 37     | 10
1      | 38    | 60.92105263  | 0      | 38     | 10
2      | 42    | 39.61904762  | 0      | 42     | 10
3      | 38    | 46.05263158  | 0      | 38     | 10
4      | 42    | 50.26190476  | 0      | 42     | 10
5      | 42    | 43.80952381  | 0      | 42     | 10

Basically, what this boils down to is this: up to the first second, we had 37 tests run concurrently, and it took an average of 0.0898 seconds (89.81 ms) for all 37 tests to complete. This yields an average of one test every 2.43 milliseconds. During those 89 ms, there was a maximum of 10 threads processing these tests. Another way of looking at it is that roughly every 3.7 tests a new thread may have been spawned. Ok, great…

Now within the next second (up to the 2nd second), one more test was crammed into the process, for a total of 38 tests running concurrently. These took 60.92 ms on average, a speed increase of roughly 32%.

In the next second (up to the 3rd second), 4 more tests were added, totaling 42 tests. These tests ran the quickest, at a blazing 39.62 ms. That’s an average of one test every 0.94 ms, which is 0.00094 seconds. Remember, 2 GHz is 2 billion ticks per second, and depending on the CPU architecture (number of cores, RISC/CISC, etc.), different sets of instructions may execute in one tick (clock cycle).

By the fourth and fifth seconds the results appear to have settled, ranging from roughly 44 to 50 milliseconds for 38 to 42 tests; taking the middle of those ranges yields about 47 ms per grouping of 40 tests.
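The arithmetic in this walkthrough can be re-derived directly from the exported rows. A short Python sketch (hard-coding the six rows shown above) recomputes the per-test spacing in the first interval, the first-to-second-interval speed-up, and the overall mean:

```python
# (second, count, average_ms, errors, passed, thread_count) rows
# taken from the Excel export shown above
rows = [
    (0, 37, 89.81081081, 0, 37, 10),
    (1, 38, 60.92105263, 0, 38, 10),
    (2, 42, 39.61904762, 0, 42, 10),
    (3, 38, 46.05263158, 0, 38, 10),
    (4, 42, 50.26190476, 0, 42, 10),
    (5, 42, 43.80952381, 0, 42, 10),
]

# average ms per test in the first interval: 89.81 / 37
per_test_ms = rows[0][2] / rows[0][1]             # ~2.43 ms per test

# relative speed-up from the first to the second interval
speedup = 1 - rows[1][2] / rows[0][2]             # ~0.32, i.e. ~32% faster

# overall mean of the per-interval averages
overall_ms = sum(r[2] for r in rows) / len(rows)  # ~55.1 ms
```

This is just a sanity check on the prose; the workbook itself is the authoritative source of the measurements.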

Simple Load Tests help you determine how fast your service operation runs with multiple copies of the same operation running at the same time. They also let you see the fastest and slowest times on a given hardware design, and how many threads it takes before a given operation’s performance degrades. Hopefully this is enough to get you started executing Unit Tests and Simple Load Tests using TWB. In the next post I will talk about Linear Load Tests, so start testing your services today!


Monitoring a BizTalk WCF Receive Location with SO-Aware

As a quick start, for those who don’t know, SO-Aware is a service metadata repository, very similar to UDDI but without the complexity, all based on a RESTful architecture. If you already know SO-Aware, you can skip to the Monitoring a BizTalk WCF Receive Location section. (UPDATE: you can also view a video of the webinar explaining and demoing this here.) It supports registering SOAP, REST, and OData based services written using Microsoft’s WCF technology stack, as well as Java-based web services. However, that’s not all. SO-Aware has five major capability buckets: Centralized Configuration, Service Testing, Dependency Modeling, Service Cataloging and, last but not least, Activity Monitoring.

Centralized Configuration

Centralized Configuration allows you to take full advantage of Microsoft’s WCF-based services by maintaining a central repository database for all WCF configurations. This feature allows you to store and retrieve all information pertaining to WCF configurations: endpoints, bindings, behaviors, URLs, and security information, to name a few. You can also dynamically change your bindings and configurations so that all your existing services can point to this central location for their configuration.

Service Testing

Service Testing allows you to test registered services. One idea derived from Service Cataloging and Centralized Configuration is that if SO-Aware knows about your service and its communication protocol, then testing becomes simple. It’s simple because if you registered the security, binding, and message format information about the service, then all SO-Aware needs to do is query this information and build the communication stack and messages to send to the service for testing. What better tool to test with than the one that understands how to communicate with the service?

Dependency Modeling

Dependency Modeling allows you to build a diagram of service versions and the dependencies between them. Thus, if you need to see which services depend on each other, you have a view into this.

Service Cataloging

Service Cataloging allows you to store and retrieve custom metadata about services, service versions, and environment details. These features allow you to query the catalog for information about services and service versions dynamically.

Activity Monitoring

Activity Monitoring allows you to see tracked events and aggregations about registered services, such as which operations were invoked, how many messages were sent to services over a period of time, and many other dimensions and measurements.

What is BizTalk Server?

BizTalk Server is one of Microsoft’s flagship products for system integration, message brokering, and service bus designs. It comes packed with a collection of tools and libraries that extend normal middleware server capabilities by supporting a loosely coupled and dynamic messaging architecture. It functions as a middleware server product that provides rapid mediation between services, endpoints, line-of-business systems, and their consumers. Enabling maximum flexibility at run time, BizTalk Server simplifies loosely coupled composition of service endpoints and management of service interactions. There are many components that allow BizTalk Server solutions to be agile in their execution. For example, BizTalk Server contains Adapters, which are connectors into different systems and protocols. It also contains Maps, which are XML stylesheet transformation documents that have been precompiled to transform documents from one format into another. There are Pipelines, which are components that allow you to translate and convert one file format into another, such as an Adobe PDF document into a comma-separated text file. There are also Schemas, which support validation of different messages, both XML and non-XML. Lastly, there are Orchestrations, which are graphically modeled business processes that support long-running transactions, compensation models, error handling, correlation, and custom logic flow.

Adapters in BizTalk are one of the key components for communicating with BizTalk Server. There are sending adapters and receiving adapters. There’s an adapter for practically every protocol that exists, such as FTP, HTTP, SMTP, TCP, FTPS (new in 2010), MSMQ, and others. There are application adapters that adhere to the rules and policies of line-of-business applications, such as SharePoint, SAP, Oracle E-Business Suite, and other CRM applications. There are even adapters for databases such as SQL Server, Oracle, and DB2. When developing solutions with BizTalk, adapters are usually configured statically, leaving little room for agility. Thus BizTalk supports the notion of Dynamic adapters for sending data. These adapters are configured at runtime, which yields very agile solutions at the sacrifice of some performance. Dynamic adapters, along with the Microsoft ESB Toolkit, can turn static BizTalk Server solutions into dynamic ones and make everything in the BizTalk arena agile.

BizTalk Server also contains monitoring capabilities. Every message and process that runs within BizTalk Server can be monitored using its internal tracking system, as well as a more open, consumer- and business-process-driven one called BAM (Business Activity Monitoring). BizTalk can automatically monitor ports, receive locations, pipelines, maps, validation attempts, business rules, orchestrations, and adapters. Using BAM, architects and developers can create custom monitoring scopes and key performance indicators for tracking and monitoring specific business requirements and processes. This leads us to why SO-Aware, and what benefits we can discover when using SO-Aware to monitor BizTalk.

Monitoring using SO-Aware

BizTalk Server contains WCF Adapters, which provide a limitless range of possibilities for communication protocols and access to various endpoint systems. What I’m trying to say here is that through the use of WCF Adapters, any and every system can potentially be accessed and communicated with. To support this claim, Microsoft released the Microsoft WCF LOB Adapter SDK. This SDK is a framework that makes it easy for developers to create “Adapters”, or better put, WCF channels, to connect to any type of system. These channels can be used with BizTalk and regular .NET applications. To go along with the SDK, Microsoft released some production-ready sample adapters based on this framework, called the Microsoft BizTalk Adapter Pack. Here’s the interesting kicker: the BizTalk monitoring architecture was not updated sufficiently to handle these new additions to the BizTalk arena.

To Monitor these WCF Adapters, you have a few options:

  1. Use the WCF Message Logging and Activity Tracing capabilities, which can log messages, operations, actions, instances, events, and everything related to WCF services. To view these traces and logs by default, you need to use the WCF Trace Viewer application provided with the .NET Framework SDK. This is your standard WCF tracing capability; it is not tied to BizTalk in any shape, form, or fashion. One nice thing about WCF message logging and tracing is the ability to control where the logs and traces are written and how much data is written, and the ability to view this information raw as it’s being recorded. Of course, each pro has a con associated with it. The downside is that turning on WCF tracing requires you to modify the BizTalk configuration file, and it will record messages for ALL WCF Adapters, send and receive. Thus, if you have a large number of adapters in use, your traces can become convoluted very quickly and fill up your trace/log repository very fast. Along with this, the traces and logs may be a little cryptic by default, unless you create your own logger and format the results accordingly.
  2. Use BAM WCF Interceptors to log custom KPIs to BAM tables, and create a custom view to see the results. This second option uses the only part of BizTalk that was really updated to handle these new WCF Adapter features: WCF Interceptors. The main issue here is that configuring the WCF Interceptor tracking profile (the WCF Interceptor configuration file) is a very tedious and complex process. As of this writing there is no graphical user interface for configuring it, like the BAM model in Excel, so the process is prone to trial and error. It’s a very powerful enhancement, just not feasible given the quick development turnaround times needed in today’s market. Let’s put it this way: it took me a month to get the very basics down, after having to deal with reverse Polish notation and Interceptor configuration files. However, once you understand it, it too has a limitless range of possibilities. Thus, for this option, I’d say a big *PRO*, with the downsides of complexity, configuration time, and brittleness.
  3. Use the WCF extension model through behaviors (which is what the BAM WCF Interceptor is: an endpoint behavior) to create a custom behavior to monitor anything and everything related to WCF services and LOB Adapter channels. This third option is much the same as the second; the only difference is that you must build your own tracking logic, message logging, tracing, and so on. The time invested in this option will outweigh the previous two options combined, but in the long run you will have complete control over its complexity and brittleness.

We at Tellago Studios have made it very easy for you to monitor these WCF Adapters. We followed the third option and created our own custom WCF service behavior. This behavior supports logging messages, actions, and operations, and also provides basic metrics such as the number of operations called and the number of errors over a specific period of time. When you configure the behavior on a WCF Receive Location, we track these details and provide a meaningful user interface to see the results.

Inside the SO-Aware Portal, clicking on the Monitoring tab reveals this content:

Figure 1: Monitoring BizTalk Receive Locations

Figure 2: Viewing a BizTalk Orchestration published as WCF Receive Location Service

Figure 3: Request/Response

Figure 4: Request Message Body and Header Details

 

So How Easy is it for you to monitor the BizTalk WCF Adapters?

Well, the answer is: as easy as adding a service behavior to the service. BizTalk Server WCF Adapters provide two ways of adding a service behavior, depending on where the service is hosted. BizTalk WCF Adapter services can be hosted inside the BizTalk process (BTSNTSVC.exe) or inside an isolated host instance, IIS (w3wp.exe). This applies only to BizTalk Receive Locations; BizTalk WCF send adapters don’t host services, they just create channel factories for sending messages to other endpoints.

Basic Steps:

  1. Install the SO-Aware Monitoring Behavior.
  2. Register your WCF Receive Location inside SO-Aware; register the service name, and service version, pointing to the correct location of the WSDL
  3. Turn on a tracking profile for the registered service version such as VerboseSoap

Figure 5: Registered BizTalk Orchestration published as a WCF Service Receive Location in SO-Aware

  4. Verify the SO-Aware Windows Host is started and running:

Figure 6: SO-Aware Windows Host

 

  5. Configure the in-process or isolated-process Receive Location to use the SO-Aware Monitoring Behavior, and you’re done.

     

Installing the SO-Aware Monitoring Behavior

  1. The monitoring behavior can be found in the SO-Aware SDK.
  2. To install, GAC the Tellago.ServiceModel.Governance.Monitoring.Client assembly (Version=1.1.0.0, Culture=neutral, PublicKeyToken=68f3f79a5464509d). This assembly can be found inside the Reference Assemblies folder (C:\Program Files (x86)\Tellago Studios\SO-Aware\SDK\Reference Assemblies, by default).
  3. Register this behavior inside the Machine Configuration file (Both 32bit and 64bit, if on a 64bit machine).
  4. You can register the behavior with this syntax inside the <behaviorExtensions> section. (Note: this must be inside both 32bit and 64bit machine configuration files for 64bit systems).

 

machine.config – Behavior Extension Registration

 

<add name="SOAwareServiceMonitoring"
     type="Tellago.ServiceModel.Governance.Monitoring.MessageTrackingBehaviorElement, Tellago.ServiceModel.Governance.Monitoring.Client, Version=1.1.0.0, Culture=neutral, PublicKeyToken=68f3f79a5464509d" />

 

 

 

Adding the SO-Aware Monitoring Behavior To an InProcess BizTalk WCF Receive Location

  1. Open the BizTalk Administration Console and navigate to the registered WCF Receive Location. Change the WCF adapter to the WCF-Custom adapter.
  2. Configure the WCF-Custom adapter to use the same binding and settings as the previous WCF Receive Location.
  3. Click on Behaviors and add the SoAwareMonitoring behavior.
  4. Configuring the behavior is simple: set the serviceVersion and soAwareConfigurationCategory properties.
  5. The serviceVersion property is set to a registered service name appended with the “(1.0)” version number, such as this:

Figure 7: Configuring the SO-Aware Service Monitoring Behavior

  6. The soAwareConfigurationCategory property is the name of the environment, if any, such as Production or QA.
  7. Last, open up the BizTalk configuration file, BTSNTSVC.exe.config, and add these two XML configuration entries:

 

BTSNTSVC.exe.config – Register Service Repository section inside Config Sections

 

<configSections>
  <section name="serviceRepository"
           type="Tellago.ServiceModel.Governance.ServiceConfiguration.ServiceRepositoryConfigurationSection, Tellago.ServiceModel.Governance.ServiceConfiguration, Version=1.1.0.0, Culture=neutral, PublicKeyToken=68f3f79a5464509d" />
</configSections>

 

 

 

BTSNTSVC.exe.config – Configure serviceRepository section to point to SO-Aware URL

 

<serviceRepository url="http://localhost:8090/So-Aware/ServiceRepository.svc" />

 

 

 

  8. Save and you’re done.

 

Adding the SO-Aware Monitoring Behavior To an Isolated Process such as IIS for a BizTalk Receive Location

  1. Open up the web.config file for the IIS Virtual Directory hosting the BizTalk Receive Location.
  2. Add these sections to your web config file:

 

web.config – Register Service Repository section inside Config Sections

 

<configSections>
  <section name="serviceRepository"
           type="Tellago.ServiceModel.Governance.ServiceConfiguration.ServiceRepositoryConfigurationSection, Tellago.ServiceModel.Governance.ServiceConfiguration, Version=1.1.0.0, Culture=neutral, PublicKeyToken=68f3f79a5464509d" />
</configSections>

 

 

 

web.config – Configure serviceRepository section to point to SO-Aware URL

 

<serviceRepository url="http://localhost:8090/So-Aware/ServiceRepository.svc" />

 

 

 

  3. Add the behavior to the WCF service:

 

web.config – Add the SO-Aware Monitoring Behavior to the serviceBehaviors section

 

<serviceBehaviors>
  <behavior name="ServiceBehaviorConfiguration">
    <serviceDebug httpHelpPageEnabled="true"
                  httpsHelpPageEnabled="false"
                  includeExceptionDetailInFaults="true" />
    <serviceMetadata httpGetEnabled="true"
                     httpsGetEnabled="false" />
    <SOAwareServiceMonitoring serviceVersion="BtsOrchSample(1.0)" />
  </behavior>
</serviceBehaviors>

 

 

 

  4. Configuring the behavior is simple: set the serviceVersion and soAwareConfigurationCategory properties to match the services registered inside SO-Aware.
  5. Save and you’re done.

     

Using SO-Aware with BizTalk 2010 and ESB 2.1 part 2 of 2

As a quick start, for those who don’t know, SO-Aware is a service metadata repository, very similar to UDDI but without the complexity, all based on a RESTful architecture. If you already know SO-Aware, you can skip to the using SO-Aware with BizTalk 2010 and ESB 2.1 section. It supports registering SOAP, REST, and OData based services written using Microsoft’s WCF technology stack, as well as Java-based web services. However, that’s not all. SO-Aware has five major capability buckets: Centralized Configuration, Service Testing, Dependency Modeling, Activity Monitoring and, last but not least, Service Cataloging.

Which leads us into the next discussion: using SO-Aware with BizTalk 2010 and ESB 2.1. The first part of this post covered how SO-Aware can use its Service Cataloging, Centralized Configuration, Dependency Modeling, and Dynamic Resolution capabilities to enhance BizTalk Server 2010. This post covers how we can use SO-Aware with BizTalk’s ESB 2.1 Toolkit.

What is BizTalk ESB 2.1?

The BizTalk ESB Toolkit is a collection of tools and libraries that extend BizTalk Server’s capabilities for supporting a loosely coupled and dynamic messaging architecture. It functions as middleware that provides tools for rapid mediation between services and their consumers. Enabling maximum flexibility at run time, the BizTalk ESB Toolkit simplifies loosely coupled composition of service endpoints and management of service interactions. The ESB Toolkit contains many components that allow BizTalk Server solutions to be agile in their execution. For example, BizTalk Server contains Adapters, which are connectors into different systems and protocols. It also contains Maps, which are XML stylesheet transformation documents that have been precompiled to transform documents from one format into another. There are Pipelines, which are components that allow you to translate and convert one file format into another, such as an Adobe PDF document into a comma-separated text file. There are also Schemas, which support validation of different messages, both XML and non-XML. Lastly, there are Orchestrations, which are graphically modeled business processes that support long-running transactions, compensation models, error handling, correlation, and custom logic flow.

Adapters in BizTalk are one of the key components for communicating with BizTalk Server. There are sending adapters and receiving adapters. There’s an adapter for practically every protocol that exists, such as FTP, HTTP, SMTP, TCP, FTPS (new in 2010), MSMQ, and others. There are application adapters that adhere to the rules and policies of line-of-business applications, such as SharePoint, SAP, Oracle E-Business Suite, and other CRM applications. There are even adapters for databases such as SQL Server, Oracle, and DB2. When developing solutions with BizTalk, adapters are usually configured statically, leaving little room for agility. Thus BizTalk supports the notion of Dynamic adapters for sending data. These adapters are configured at runtime, which yields very agile solutions at the sacrifice of some performance, naturally. Using Dynamic adapters requires knowledge of which properties must be set for a particular adapter, plus custom code inside Orchestrations and/or custom pipeline components. This is where some aspects of the ESB 2.1 Toolkit come in. The ESB 2.1 Toolkit contains Adapter Providers, which can dynamically configure adapters for use in an agile solution. All the Adapter Provider needs is the runtime configuration values.

The ESB Toolkit utilizes components called Resolvers, which feed the Adapter Provider the runtime configuration values so that it can configure the Dynamic adapter at runtime. Resolvers can discern not only which adapter runtime configuration to use, but also which Map, Pipeline, Schema, and Orchestration. Resolvers are simply .NET components that build up a dictionary collection of entries, which then tell other ESB components what values to use for the various BizTalk components mentioned earlier.
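Conceptually, a resolver just turns a resolution request into that dictionary of name/value entries for downstream components to consume. The following language-neutral Python sketch is purely illustrative; the function, fact names, and URL are hypothetical, not the actual ESB Toolkit or SO-Aware API:

```python
def so_aware_resolve(service_name, version, operation):
    """Hypothetical resolver: look up a registered service version and
    return the dictionary of facts an adapter provider would consume."""
    # In the real toolkit this would query the SO-Aware repository;
    # here a hard-coded catalog stands in for it.
    catalog = {
        ("PeopleFinder", "1.0"): {
            "Resolver.TransportType": "WCF-BasicHttp",
            "Resolver.TransportLocation": "http://localhost/PeopleFinder.svc",
            "Resolver.Action": "GetAllRegisteredPeople",
        },
    }
    facts = dict(catalog[(service_name, version)])  # copy the registered entry
    facts["Resolver.Action"] = operation            # override with the requested operation
    return facts

facts = so_aware_resolve("PeopleFinder", "1.0", "GetAllRegisteredPeople")
```

The important point is the shape of the result: a flat dictionary of entries that adapter providers, routing, and transformation services can all read without knowing where the values came from.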

Using Resolvers dramatically changes the way architects and developers can build, design, and implement BizTalk Server 2010 solutions. As a quick example, most BizTalk designs prior to the ESB Toolkit hard-coded and statically configured maps. A Map associates a source schema with an XSLT transformation and a destination schema. If the source, destination, or transformation stylesheet changes, it can potentially render the map invalid and cause all kinds of errors. A more agile design allows architects to use different versions of maps at runtime, and even dynamically pre-compile an auto-generated map, all at runtime.

The ESB Toolkit not only contains Resolvers, but also Routing Services, Transformation Services, Itinerary Services, Exception Management services, and a portal. With the exception of the Error portal, Resolvers play an integral role, communicating with the Routing, Itinerary, and Transformation Services. All this to say: Resolvers are the key component of the ESB Toolkit. The ESB Toolkit comes with many Resolvers: Static, Business Rules, XPath, UDDI, WSMEX, and LDAP, all of which allow for the dynamic resolution of Maps and Adapter Providers.

Which leads us to SO-Aware. SO-Aware provides an ESB Resolver that allows you to dynamically resolve Adapter Providers, Maps, schemas, pipelines, orchestrations, and WCF service bindings, versions, behaviors, and other configurations. Inside the SDK, you can find samples showing how to use the SO-Aware Resolver to resolve Adapter Providers and WCF service bindings, versions, and behaviors. The SO-Aware Resolver and database can easily be extended to resolve Maps, schemas, pipelines, and orchestrations.

 
 

Using the SO-Aware ESB Resolver

 
 

After installing the SO-Aware SDK, you can find the solution and samples inside C:\Program Files (x86)\Tellago Studios\SO-Aware\SDK\Samples\BizTalkESB\vs.net2k10\. You can install the SO-Aware Resolver by running the setup on a BizTalk 2010 ESB 2.1 system. The installer can be found here: C:\Program Files (x86)\Tellago Studios\SO-Aware\SDK\Samples\BizTalkESB\vs.net2k10\Tellago.SOA.ESB.Extensions.Solution\Tellago.SOA.ESB.Extensions.Setup\Debug\setup.exe.

After installing the SO-Aware ESB Resolver, you can use the Resolver inside Business Rules, Itineraries, Orchestrations, or pipeline components: basically anywhere resolvers can be used. Below is an example of how to use the SO-Aware Resolver inside an Itinerary.

Steps.

1. Create a new BizTalk 2010 Itinerary inside a C# Library project

2.

3. Add An On-Ramp, Off-Ramp, and 2 Itinerary Service steps into the Itinerary

4. Right click on the Resolver on one of the Itinerary Service Steps and add a new Resolver:


5. In the properties windows select the Tellago SO-Aware ESB Resolver


6. Name the resolver SoAware, and set the properties:


The SO-Aware Resolver exposes these properties:

Endpoint Name: The name of the endpoint registered inside the SO-Aware repository.

Environment: The configuration category, such as “Production”, “Q&A”, or “Testing”, that the service version is registered under.

Operation Name: The name of the operation of the service version registered inside SO-Aware.

Service Name: The name of the service whose configuration you would like to retrieve.

SO-Aware URL: The URL where the SO-Aware Service Repository service is installed.

Version: The version number of the service registered inside SO-Aware.

7. Configure the remaining itinerary steps, choosing a Receive Port/Location and a Dynamic Send Port for the Off-Ramp, and adding the itinerary connectors to complete the itinerary design.


8. This simple itinerary resolves the GetAllRegisteredPeople operation of the People Finder service, version 1.0.

Note: The BizTalk 2009 version of this resolver is a more generic example, whereas the BizTalk 2010 version contains more functionality.
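To make the resolver properties above concrete, here is a small hedged sketch, in Python rather than BizTalk, of how the Service Name and Version values combine into the “Name(Version)” key that the repository uses to identify a service version (the same pattern the SDK bridge sample later in this document queries on). The property values and the `service_version_name` helper are illustrative, not part of the SO-Aware API.

```python
# Hedged sketch: the resolver's Service Name and Version properties combine
# into the "Name(Version)" display name that identifies a service version in
# the SO-Aware repository (see the bridge sample later in this document).
def service_version_name(service_name, version):
    """Compose the repository's service-version display name."""
    return "{0}({1})".format(service_name, version)

# Illustrative resolver property values (assumptions, not a real registration).
resolver_properties = {
    "Endpoint Name": "BasicHttpEndpoint",
    "Environment": "Testing",
    "Operation Name": "GetAllRegisteredPeople",
    "Service Name": "PeopleFinder",
    "SO-Aware URL": "http://localhost:8088/SO-Aware/ServiceRepository.svc",
    "Version": "1.0",
}

key = service_version_name(resolver_properties["Service Name"],
                           resolver_properties["Version"])
print(key)  # PeopleFinder(1.0)
```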

Using SO-Aware with BizTalk 2010 and ESB 2.1 part 1 of 2

As a quick start, for those that don’t know, SO-Aware is a service metadata repository, very similar to UDDI but without the complexity, and based entirely on a RESTful architecture. If you already know SO-Aware, you can skip to the using SO-Aware with BizTalk 2010 and ESB 2.1 section. It supports registration of SOAP, REST, and OData based services written using Microsoft’s WCF technology stack, as well as Java-based web services. However, that’s not all. SO-Aware has five major capability buckets: Centralized Configuration, Service Testing, Dependency Modeling, Activity Monitoring, and last but not least Service Cataloging.

Centralized Configuration lets you take full advantage of Microsoft’s WCF based services by maintaining a central repository database for all WCF configurations. This feature allows you to store and retrieve all information pertaining to WCF configurations. You can retrieve endpoints, bindings, behaviors, URLs, and security information, just to name a few. You can also dynamically change your bindings and configurations so that all your existing services can point to this central location for their configuration.

 

Service Testing allows you to test registered services. One idea that follows from Service Cataloging and Centralized Configuration is that if SO-Aware knows about your service and its communication protocol, then testing becomes simple. It’s simple because if you registered the security, binding, and message type format information about the service, then all SO-Aware needs to do is query this information and build the communication stack and messages to send to the service for testing. What better tool to test with than the one that understands how to communicate with the service?

 

Dependency Modeling allows you to build a diagram of service versions and their dependencies. Thus, if you need to see which services depend on each other, you have a view into this.

 

Activity Monitoring allows you to see tracked events and aggregations about registered services, such as which operations were invoked and how many messages were sent to services over a period of time, along with many other dimensions and measurements.

 

Service Cataloging allows you to store and retrieve custom metadata about services, service versions, and environment details. These features let you query the catalog for information about services and service versions dynamically.

 

Which leads us into the next discussion: using SO-Aware with BizTalk 2010 and ESB 2.1. First, for those that don’t know BizTalk and ESB 2.1, here’s a quick blurb from Microsoft’s web site on each technology:

What is BizTalk?

BizTalk Server is Microsoft’s Integration and connectivity server solution. A mature product on its seventh release, BizTalk Server 2010 provides a solution that allows organizations to more easily connect disparate systems. Including over 25 multi-platform adapters and a robust messaging infrastructure, BizTalk Server provides connectivity between core systems both inside and outside your organization. In addition to integration functionality, BizTalk also provides strong durable messaging, a rules engine, EDI connectivity, Business Activity Monitoring (BAM), RFID capabilities and IBM Host/Mainframe connectivity.

What is BizTalk ESB 2.1?

The BizTalk ESB Toolkit is a collection of tools and libraries that extend BizTalk Server capabilities of supporting a loosely coupled and dynamic messaging architecture. It functions as middleware that provides tools for rapid mediation between services and their consumers. Enabling maximum flexibility at run time, the BizTalk ESB Toolkit simplifies loosely coupled composition of service endpoints and management of service interactions.

Using SO-Aware with BizTalk 2010

One of the main features of BizTalk Server is its adapter component design. BizTalk uses adapters to communicate with various systems, such as line of business systems, databases, mainframe applications, and most notably web services. To be more specific about web services, BizTalk contains WCF adapters that facilitate communication with different types of web services: REST/OData and SOAP. The more involved BizTalk becomes in integration solutions, the more WCF adapters and web services are used to complement or run the entire solution.

Maintaining such a solution can easily become a maintenance nightmare. When a particular web service changes or is versioned, there are no tools that help keep track of this. Trying to remember all the configuration values is usually impossible, and the solution becomes brittle to the slightest change. SO-Aware alleviates this with Service Cataloging, Centralized Configuration, Dependency Modeling, and Dynamic Resolution.

With Service Cataloging, SO-Aware lets you register the WCF adapter configurations for receive locations (BizTalk as a service provider) and the web service configurations that the WCF adapters communicate with through send ports (BizTalk as a service consumer/client). Below is an example of what may be registered.

BizTalk WCF Adapters Cataloged

Centralized Configuration allows a BizTalk architect to store configurations about WCF adapter bindings (the individual components that make up a complete communication channel for the WCF adapter). Using the centralized configuration of SO-Aware, you can store binding information, behavior information, extension information, security details, transport protocol information, line of business information, and so on. We include binding and behavior templates that even provide a dynamic user interface for configuring the most commonly changed settings. This feature facilitates the use of BizTalk’s WCF LOB adapters such as SAP, Oracle, Oracle E-Business Suite, and SQL. Below is an example of registering the Oracle E-Business Suite LOB adapter binding and binding template.

image

With Dependency Modeling, you can quickly view which services depend on each other. The reason this is important in BizTalk is simple. Say you have a business process flow that requires order information to be uploaded to two different web services in one atomic transaction, such as an Order Header Service and an Order Detail Service. This business process can be modeled using BizTalk’s Orchestration designer. Below is an image depicting the business process flow.

image

When these services change later, who will remember that these services are dependent upon each other, and that this orchestration depends on both of these services? SO-Aware can easily depict this dependency:

image

Dynamic Resolution

Another option we have with BizTalk Server is using dynamic adapters. Dynamic adapters allow for runtime resolution of adapters such as the WCF adapters. This resolution can be performed in two BizTalk components: orchestrations and pipeline components. Using SO-Aware, dynamic runtime resolution is very easy to do. SO-Aware exposes a RESTful interface to every aspect of its repository. This means that any component that supports invoking .Net code, as BizTalk’s orchestrations and pipeline components do, can query SO-Aware for adapter configuration. Included in the SDK is an example of how to use a BizTalk orchestration with dynamic ports and adapters to implement a runtime resolution solution. Below is an example of code inside the BizTalk orchestration invoking dynamic runtime resolution.

image

In the above example, the ProcessOrderInformation orchestration dynamically resolves the OrderDetailPort, which is configured at runtime to send data to the OrderDetailService registered inside SO-Aware. At runtime, when BizTalk receives an order, it splits the order into a Header message and an Order Details message and sends both messages to their respective services in one atomic transaction. The difference is that the OrderDetailService’s binding, configuration, address, and transport protocol are dynamically resolved using SO-Aware API calls. The Order Detail Service is registered inside the SO-Aware repository and, when queried, returns the WCF service configuration details. Below are some depictions of the Order Detail Service SO-Aware registration.

image

 

image

Remember, this example used an orchestration; the same approach applies to BizTalk pipeline components as well. In the next posting, I will discuss how SO-Aware can be utilized with ESB 2.1 and BizTalk 2010 together.

Using JQuery with SO-Aware

As a quick start, for those that don’t know, SO-Aware is a service metadata repository, very similar to UDDI but without the complexity, and based entirely on a RESTful architecture. If you already know SO-Aware, you can skip to the explanation of how it works, or skip to the how-to section. It supports registration of SOAP, REST, and OData based services written using Microsoft’s WCF technology stack, as well as Java-based web services. However, that’s not all. SO-Aware has five major capability buckets: Centralized Configuration, Service Testing, Dependency Modeling, Activity Monitoring, and last but not least Service Cataloging.

Centralized Configuration lets you take full advantage of Microsoft’s WCF based services by maintaining a central repository database for all WCF configurations. This feature allows you to store and retrieve all information pertaining to WCF configurations. You can retrieve endpoints, bindings, behaviors, URLs, and security information, just to name a few. You can also dynamically change your bindings and configurations so that all your existing services can point to this central location for their configuration.

Service Testing allows you to test registered services. One idea that follows from Service Cataloging and Centralized Configuration is that if SO-Aware knows about your service and its communication protocol, then testing becomes simple. It’s simple because if you registered the security, binding, and message type format information about the service, then all SO-Aware needs to do is query this information and build the communication stack and messages to send to the service for testing. What better tool to test with than the one that understands how to communicate with the service?

Dependency Modeling allows you to build a diagram of service versions and their dependencies. Thus, if you need to see which services depend on each other, you have a view into this.

Activity Monitoring allows you to see tracked events and aggregations about registered services, such as which operations were invoked and how many messages were sent to services over a period of time, along with many other dimensions and measurements.

Service Cataloging allows you to store and retrieve custom metadata about services, service versions, and environment details. These features let you query the catalog for information about services and service versions dynamically.

Which leads us into the primary discussion of this post: how to use AJAX and jQuery with SO-Aware. SO-Aware provides many APIs for retrieving and updating the registered information. As a developer you can use SO-Aware’s OData APIs, .Net APIs, Microsoft’s Windows PowerShell, and lastly JavaScript, AJAX, and/or jQuery API calls.

Recently, a customer asked if SO-Aware can be used with a JavaScript client application. The customer wanted to build a web site that could use AJAX and jQuery to retrieve binding, URI, and service version information. So naturally, when asked, we responded with an explanation and a how-to. I figured I’d share it with you all.

Underneath the hood, SO-Aware utilizes WCF Data Services to expose various APIs for querying and updating registered services inside SO-Aware’s repository. WCF Data Services are based on a RESTful architecture that uses OData feeds as its message format. As a developer, the quickest and simplest way to access SO-Aware is to use the REST styled URIs for its data. For example, to see a listing of services registered inside SO-Aware, you could use a URI similar to http://localhost:8088/SO-Aware/ServiceRepository.svc/Services, as this would return an OData ATOM feed of all registered services.

WCF Data Services support a number of optional query parameters, also termed “query options”, that refine the queried data through options appended to the end of the URI as query parameters. One option, $filter, allows an application to filter the results based on a string function, math function, logical operator, or expression. For example, if you wanted to look for all services registered in SO-Aware with versions greater than 1.0, you could write a query option filter like this: http://localhost:8088/SO-Aware/ServiceRepository.svc/ServiceVersions?$filter=MajorVersion ge 1 and MinorVersion gt 0. I’ll discuss more query options in another post; to read about all supported query options, read the SO-Aware documentation.
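Since a $filter expression contains spaces, a client has to percent-encode it before putting it on the URI. Here is a minimal sketch, in Python rather than a WCF client, of composing such a query URI; the `query_url` helper is invented for illustration, while the base address, entity set, and filter text come from the example above.

```python
# Hedged sketch: composing WCF Data Services query options against the
# SO-Aware repository URIs shown above. quote() percent-encodes the spaces
# that a $filter expression contains.
from urllib.parse import quote

BASE = "http://localhost:8088/SO-Aware/ServiceRepository.svc"

def query_url(entity_set, **options):
    """Build an OData query URL; options like filter=... become $filter=..."""
    parts = ["${0}={1}".format(name, quote(value))
             for name, value in sorted(options.items())]
    return "{0}/{1}?{2}".format(BASE, entity_set, "&".join(parts))

url = query_url("ServiceVersions",
                filter="MajorVersion ge 1 and MinorVersion gt 0")
print(url)
```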

One of the interesting decisions the WCF Data Services team made was to not enable support for a specific query option by default: the $format option. By default, WCF Data Services does not allow you to use the $format query option. This option allows developers and applications to change the format of the data returned from a WCF Data Service. The $format option supports three formats: JSON, XML, and ATOM. Not to worry here; we enabled this option for the SO-Aware service repository, so you can choose which format you want to use. The way we enabled this option is similar to various blogs and posts on the subject, although we used a custom WCF behavior instead. We support two of the three formats: JSON and ATOM. That shouldn’t be a big deal, since ATOM is XML anyway. By enabling this option, you can use AJAX and jQuery with JavaScript clients to retrieve JSON based queries.
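As a quick illustration of what a client does with $format=json, here is a hedged sketch in Python: append the option to the query URI and parse the returned body with a standard JSON parser. The canned response below is an invented minimal stand-in for a real WCF Data Services verbose-JSON payload (the "d" wrapper), shown only to demonstrate the parsing step.

```python
# Hedged sketch: request JSON from the repository via the $format option
# that SO-Aware enables, then read the body with the standard json module.
import json

url = ("http://localhost:8088/SO-Aware/ServiceRepository.svc/Services"
       "?$format=json")

# A real client would GET `url`; here we parse a hand-made stand-in payload
# shaped like WCF Data Services verbose JSON (the "d" wrapper).
sample_response = '{"d": {"results": [{"Name": "OrderDetailService"}]}}'

payload = json.loads(sample_response)
names = [svc["Name"] for svc in payload["d"]["results"]]
print(names)  # ['OrderDetailService']
```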

One of the main challenges with using AJAX and jQuery with web services is the security model around accessing remote services. Most AJAX implementations outright do not support it, while others use some sort of “policy access” security file to allow specific URLs for AJAX/jQuery access. One of the easiest ways to work around the policy access or unsupported implementations is to use a “bridge” methodology. A bridge methodology is a simple idea: because of the security restrictions on accessing remote services, you build a local, server side, AJAX/JSON capable web service that forwards the calls to the SO-Aware repository. The bridge bypasses the security restrictions and allows the AJAX/jQuery client side interactions. In our SO-Aware SDK samples, we have built an example bridge using Microsoft’s ASP.NET; however, Java Server Pages would work just as well. The project in reference is the JQueryAjaxSolution project. Its code is really simple and straightforward.

Build an ASP.NET Bridge to wrap the SO-Aware ServiceRepository service for use with AJAX/JQuery clients.

Steps:

1. In your existing ASP.NET web site that will use jQuery and AJAX, add a new ASP.NET Web Form to your site.

2. Add a WCF Data Service reference to your project. You can use the DataSvcUtil program and point your reference to your install of SO-Aware.

3. Inside the Page_Load method add the following code:

    protected void Page_Load(object sender, EventArgs e)
    {
        Response.ContentType = "application/json";
        SOAwareService.ResourceRepositoryContext ctxt = new SOAwareService.ResourceRepositoryContext(
            new Uri("http://localhost:8088/SO-Aware/ServiceRepository.svc"));

        ctxt.Credentials = System.Net.CredentialCache.DefaultCredentials;

        var serviceName = Request.QueryString["ServiceName"];
        var serviceVersion = Request.QueryString["ServiceVersion"];
        var category = Request.QueryString["Category"];

        var services = ctxt.ServiceVersions
            .Expand("Service")
            .Expand("Soap")
            .Expand("OData")
            .Expand("Rest")
            .Expand("ServiceDependencies")
            .Expand("DependantServices")
            .Expand("TrackingProfile")
            .Expand("ConfigurationCategory")
            .Where(sv => sv.Name == string.Format("{0}({1})", serviceName, serviceVersion)
                      && sv.ConfigurationCategory.Name == category);
        var service = services.FirstOrDefault();
        var serviceId = service.Id;

        HttpClient client = new HttpClient("http://localhost:8088");
        client.TransportSettings.Credentials = CredentialCache.DefaultCredentials;
        HttpRequestMessage req = new HttpRequestMessage("GET",
            string.Format("/SO-Aware/ServiceRepository.svc/ServiceVersions(guid'{0}')?$expand=Service,Soap,OData,Rest,ConfigurationCategory,ServiceDependencies,DependantServices,TrackingProfile&$format=json", serviceId));
        var result = client.Send(req);
        Response.Write(result.Content.ReadAsString());
    }

4. At this point you have a “bridge” based on an ASP.NET web page that supports JSON, AJAX, and jQuery.

5. Now open the web page that needs to use jQuery or AJAX, or create a new web page.

6. Include your jQuery and AJAX JavaScript files.

7. Type your jQuery code; here’s an example:

    <script id="jquerySample" type="text/javascript">
        function GetData() {
            jQuery.ajax(
            { url: String.format("webform1.aspx?Servicename={0}&ServiceVersion={1}&Category={2}",
                ServiceName.value, ServiceVersion.value, Category.value),
                type: "GET",
                dataType: "json",
                success: function (data, textStatus, XmlHttpRequest) {
                    Details.innerHTML = "<p>JSON: " + JSON.stringify(data, null, '\t') + "</p>";
                }
            });
        }
    </script>

    <h2><% ViewData["Message"]%></h2>
    <p>
        To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>.
    </p>
    <p>SO-Aware AJAX jQuery Sample</p>

    <p>Type In SO-Aware URL:</p><input type="text" id="SOAwareUrl" />
    <p>ServiceName:</p><input type="text" id="ServiceName" />
    <p>ServiceVersion:</p><input type="text" id="ServiceVersion" />
    <p>Category:</p><input type="text" id="Category" />
    <p><input type="button" id="SendResults" value="Resolve Service Endpoint Information" onclick="GetData()"/></p>
    <p>Details:</p><span id="Details" style="background: azure; width: 180px; height: 120px" />

8. You’re done. You have a bridge ASP.NET web page that supports jQuery and AJAX clients.


Web Service Discovery in SO-Aware (Part 2)

In my last post we discussed what Web Service Discovery is, and how its two design patterns give us different implementations, such as UDDI, SO-Aware, and WCF 4.0 Discovery. In this post we will discuss WCF 4.0 Discovery, and how SO-Aware handles the two modes of WCF Discovery.

How Discovery works today

WS-Discovery provides a protocol to discover services that are joining and leaving a network. As a service joins the network, it informs its peers of its arrival by broadcasting a Hello message; likewise, when services drop off the network they multicast a Bye message. WS-Discovery doesn’t rely on a single node to host information about all available services, as UDDI and SO-Aware do. (SO-Aware mitigates this single-node dependency for registered WCF services by downloading a cached copy of the configuration to the WCF service, such that the WCF service host retrieves its settings from the cached copy if SO-Aware is down.) Rather, with WS-Discovery each node forwards information about available services in one of two ways: an ad hoc fashion or a managed mode. This reduces the amount of network infrastructure needed to discover services and facilitates bootstrapping. My business partner, Jesus Rodriguez, has an excellent blog posting on WS-Discovery with WCF 4.0. When using WS-Discovery with WCF 4.0, we can use the SO-Aware Service Repository as both the ad hoc and managed mode implementation.

How SO-Aware Handles WCF 4.0 Discovery ad hoc mode

Using WCF 4.0 Discovery requires running the SO-Aware web site and ServiceRepository service on .Net 4.0. You can tell ASP.NET to dynamically recompile the web site and data service against the .Net 4.0 framework. Follow the steps outlined in this post (http://tellagostudios.com/how-setup-so-aware-run-net-40) to do this.

Next, turn on WCF ad hoc discovery in the service host. The current SO-Aware Service Repository host uses the default WCF DataService service host. By default this host does not know anything about the discovery mechanism, so I have created a new service host factory that injects details about the discovery mechanism. This host factory is named Tellago.ServiceModel.Governance.Data.DataServiceHostFactory; it derives from the default WCF DataServiceHost factory and adds the discovery behavior and the announcement and discovery endpoints to the DataServiceHost, making it support the WCF 4.0 Discovery mechanism. Open the ServiceRepository.svc markup with a text editor and change its Factory to point to the new DataServiceHostFactory, as such:

Factory = "Tellago.ServiceModel.Governance.Data.DataServiceHostFactory, Tellago.ServiceModel.Governance.Data"

Look for the Tellago.ServiceModel.Governance.Data project inside the SDK folder (\Samples\.Net40\WCF Discover) and compile the project. Copy the resulting dll into the bin directory of the SO-Aware web site and run the site, and you now have the ad hoc discovery mechanism. Adding a service version (OData, REST, or SOAP) or removing a service version entry generates "Online" and "Offline" announcement broadcasts to all listening announcement endpoint clients. (Note: this example only works when the client announcement endpoints are on the same subnet as the SO-Aware ServiceRepository service.) It also uses the managed mode approach by creating a discoverable host and proxy which clients can use to listen for these announcements. I stress, this is sample code only, as it shows how to implement such a thing with SO-Aware.

The custom service host factory works by injecting a WCF message inspector into the DataServiceHost. The message inspector parses the DataService message contents. A funny thing about WCF Data Services is that the message content is sent and received in a binary, base64 encoded, multi-part message format. Basically, it’s a batch message of XML post/merge/put/delete messages, all delimited by "\r\n\r\n--changeset_{guid}". This was not fun to parse, let me tell you. In any case, the inspector looks within the batch message for the XML contents, a POST method verb in the message properties, and a term="" attribute naming the entity type in question. If the entity type is "ServiceVersion" and the POST contains a term="restdescription", "odatadescription", or "soapdescription", then I know the insert or update message is a service version. For "Offline" broadcasts, the message inspector looks for an HTTP DELETE verb in the message properties of the batch and looks for a "ServiceVersion" term as well as the "restdescription", "odatadescription", and "soapdescription" entity names (lowercased) to determine whether it should broadcast "Offline" messages to announcement endpoints.
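To make the parsing chore above concrete, here is a small sketch, in Python rather than a WCF message inspector, of the same idea: split the batch body on the changeset delimiter, then look in each part for the HTTP verb and the term="..." entity type. The sample batch text and the `classify_parts` helper are hand-made stand-ins, not a captured WCF Data Services message.

```python
# Hedged sketch of the batch parsing described above: split the multipart
# body on the "--changeset_{guid}" delimiter, then check each part for the
# HTTP verb and the term="..." entity type (lowercased, as the inspector does).
import re

CHANGESET = re.compile(r"\r\n\r\n--changeset_[0-9a-fA-F-]+")

def classify_parts(batch_body):
    """Yield (verb, term) for each changeset part that names an entity type."""
    for part in CHANGESET.split(batch_body):
        verb = re.search(r"^(POST|MERGE|PUT|DELETE)\b", part.strip())
        term = re.search(r'term="([^"]+)"', part)
        if verb and term:
            yield verb.group(1), term.group(1).lower()

# Hand-made stand-in batch: one insert of a SOAP description, one delete
# of a service version (paths and guid are illustrative only).
sample = ("POST /ServiceRepository.svc/SoapDescriptions HTTP/1.1\r\n"
          'Content-Type: application/atom+xml\r\n\r\n<category term="soapdescription"/>'
          "\r\n\r\n--changeset_1a2b3c\r\n"
          "DELETE /ServiceRepository.svc/ServiceVersions(guid'0') HTTP/1.1\r\n"
          '\r\n<category term="ServiceVersion"/>')

for verb, term in classify_parts(sample):
    print(verb, term)
# prints:
# POST soapdescription
# DELETE serviceversion
```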

To create a client application that uses the ad hoc discovery service is rather simple. First you create a DiscoveryClient instance and pass a UdpDiscoveryEndpoint instance to its constructor. The UdpDiscoveryEndpoint already contains a hardcoded URI that points to the standard urn:docs-oasis-open-org:ws-dd:ns:discovery:2009:01 address version; believe it or not, this is a valid URI address. Once the discovery client is created, create event handlers to listen for when the discovery client has completed its Find operation. Start the Find operation by invoking the FindAsync() method of the discovery client, wait for the responses, and you’re done.

Here’s a code example:

    class Program
    {
        static DiscoveryClient client;
        static AnnouncementService svc = new AnnouncementService();
        static ServiceHost announcementHost = null;

        static void Main(string[] args)
        {
            client = new DiscoveryClient(new UdpDiscoveryEndpoint());
            client.FindProgressChanged += new EventHandler<FindProgressChangedEventArgs>(client_FindProgressChanged);
            client.FindCompleted += new EventHandler<FindCompletedEventArgs>(client_FindCompleted);
            client.FindAsync(new FindCriteria(typeof(ServiceRepositoryDataService)));

            announcementHost = new ServiceHost(svc);
            announcementHost.AddServiceEndpoint(new UdpAnnouncementEndpoint());
            svc.OnlineAnnouncementReceived += new EventHandler<AnnouncementEventArgs>(svc_OnlineAnnouncementReceived);
            svc.OfflineAnnouncementReceived += new EventHandler<AnnouncementEventArgs>(svc_OfflineAnnouncementReceived);

            announcementHost.BeginOpen((result) =>
            {
                announcementHost.EndOpen(result);
            }, null);

            Console.WriteLine("Searching...\n");
            Console.ReadLine();
        }

        static void svc_OfflineAnnouncementReceived(object sender, AnnouncementEventArgs e)
        {
            Console.WriteLine("Service went offline: {0}", e.EndpointDiscoveryMetadata.Address);
        }

        static void svc_OnlineAnnouncementReceived(object sender, AnnouncementEventArgs e)
        {
            Console.WriteLine("Service came online: {0}", e.EndpointDiscoveryMetadata.Address);
        }

        static void client_FindCompleted(object sender, FindCompletedEventArgs e)
        {
            if (e.Error == null)
            {
                if (e.Result.Endpoints.Count > 0)
                {
                    Console.WriteLine("Find Completed Ran\n{0}", e.Result.Endpoints[0].Address);
                }
                if (client.InnerChannel.State == System.ServiceModel.CommunicationState.Opened)
                {
                    client.Close();
                }
                Console.ReadLine();
            }
            else
            {
                Console.WriteLine(e.Error.ToString());
                Console.ReadLine();
            }
        }

        static void client_FindProgressChanged(object sender, FindProgressChangedEventArgs e)
        {
            Console.WriteLine("Percentage Complete:\n{0}%\n", e.ProgressPercentage.ToString());
        }
    }



How SO-Aware Handles WCF 4.0 Discovery in managed mode

One of the disadvantages of ad hoc discovery is that all the clients must reside on the same subnet, due to the UDP multicasting protocol. Managed mode discovery provides a way to discover services over other protocols besides the UDP multicast protocol. It works by way of a custom "discoverable proxy". The discoverable proxy can query for online services through discovery endpoints, or through any custom means. The concept is simple: a client or service interested in discovery first calls the discoverable proxy and asks it for registered services. The discoverable proxy then returns a list of "Online" services as announcements or over some other transport mechanism. This is where SO-Aware can really shine. The SO-Aware discoverable proxy uses the SO-Aware central repository to discover services going "Online" and "Offline".

The steps to use this discoverable proxy are simple: you can continue to use the announcement client endpoints as long as they reside on the same subnet, or you can create a client that queries directly into the proxy using the address of the discoverable proxy. In this release, the address is hard coded to the net.tcp://localhost:7777/discoverableproxy endpoint address. The next release, v2 of the SO-Aware implementation, will let you take full advantage of the configuration and repository to control which binding and which listening URI are used.

The SO-Aware discoverable proxy works by creating a class that derives from the DiscoveryProxy abstract base class. This class lets you implement four asynchronous paired (Beginxx and Endxx) methods: OnBeginOnlineAnnouncement, OnBeginResolve, OnBeginFind, and OnBeginOfflineAnnouncement.

Here’s a sample OnBeginFind method example:

    protected override IAsyncResult OnBeginFind(FindRequestContext findRequestContext, AsyncCallback callback, object state)
    {
        List<EndpointDiscoveryMetadata> services = new List<EndpointDiscoveryMetadata>();
        var ctxt = new ResourceRepositoryContext(this.ServiceRepositoryUri);
        ctxt.Credentials = System.Net.CredentialCache.DefaultCredentials;

        var Soaps = from sp in ctxt.SoapDescriptions.Expand("ServiceVersion")
                    select sp;
        foreach (var soapDescription in Soaps)
        {
            foreach (var ep in soapDescription.Endpoints)
            {
                EndpointDiscoveryMetadata discovery = new EndpointDiscoveryMetadata();
                discovery.Address = new EndpointAddress(ep.Address);
                discovery.Extensions.Add(XElement.Parse(String.Format("<ServiceVersionName>{0}</ServiceVersionName>", soapDescription.ServiceVersion.Name)));
                services.Add(discovery);
            }
        }
        var odatas = from od in ctxt.ODataDescriptions.Expand("ServiceVersion")
                     select od;
        foreach (var oDataDescription in odatas)
        {
            EndpointDiscoveryMetadata discovery = new EndpointDiscoveryMetadata();
            discovery.Address = new EndpointAddress(oDataDescription.MetadataURI);
            discovery.Extensions.Add(XElement.Parse(String.Format("<ServiceVersionName>{0}</ServiceVersionName>", oDataDescription.ServiceVersion.Name)));
            services.Add(discovery);
        }
        var rests = from rs in ctxt.RestDescriptions.Expand("ServiceVersion")
                    select rs;
        foreach (var restDescription in rests)
        {
            EndpointDiscoveryMetadata discovery = new EndpointDiscoveryMetadata();
            discovery.Address = new EndpointAddress(restDescription.BaseURI);
            discovery.Extensions.Add(XElement.Parse(String.Format("<ServiceVersionName>{0}</ServiceVersionName>", restDescription.ServiceVersion.Name)));
            services.Add(discovery);
        }
        var query = from service in services
                    where findRequestContext.Criteria.IsMatch(service)
                    select service;

        var queryCache = from service in cache
                         where findRequestContext.Criteria.IsMatch(service)
                         select service;

        foreach (var metadata in query)
        {
            findRequestContext.AddMatchingEndpoint(metadata);
        }

        foreach (var endpointDiscoveryMetadata in queryCache)
        {
            findRequestContext.AddMatchingEndpoint(endpointDiscoveryMetadata);
        }
        return new OnFindAsyncResult(callback, state);
    }



Briefly, let’s talk about each of the four asynchronous methods; if you want full details, MSDN is the best resource for that. The names are self-explanatory. BeginFind lets you look up services; when you find one, you call findRequestContext.AddMatchingEndpoint(), which immediately announces to the client that a service has been found. The BeginOnlineAnnouncement and BeginOfflineAnnouncement methods let you cache announcements for clients calling into your proxy for this information. And lastly, BeginResolve allows you to query your cache, or in our case the SO-Aware Service Repository, for service versions and return an endpoint address and service version metadata.
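
To make the BeginFind flow concrete, here is a small language-neutral sketch (written in Python purely for illustration; the catalog shape and the "contract" field are my own assumptions, not the WCF object model) of how a managed proxy matches a find request against the endpoints it knows about:

```python
# Illustrative sketch (not the WCF API): a managed discovery proxy keeps a
# catalog of endpoint metadata and reports every entry that matches the
# caller's find criteria, the way BeginFind calls AddMatchingEndpoint().

def is_match(criteria, endpoint):
    """Stand-in for FindCriteria.IsMatch: the endpoint qualifies when it
    implements the requested contract."""
    return criteria["contract"] in endpoint["contracts"]

catalog = [
    {"address": "net.tcp://host/orders",
     "contracts": ["IOrderService"],
     "extensions": {"ServiceVersionName": "OrderService(1.0)"}},
    {"address": "http://host/repo.svc",
     "contracts": ["ServiceRepositoryDataService"],
     "extensions": {"ServiceVersionName": "ServiceRepository(1.0)"}},
]

def find(criteria, catalog):
    # BeginFind's core loop: each matching endpoint would be reported back
    # to the caller via AddMatchingEndpoint; here we simply collect them.
    return [ep for ep in catalog if is_match(criteria, ep)]

matches = find({"contract": "ServiceRepositoryDataService"}, catalog)
print([m["address"] for m in matches])  # -> ['http://host/repo.svc']
```

The same shape covers the announcement cache: online announcements add entries to the catalog, offline announcements remove them, and subsequent find requests see the updated list.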

Creating a client application that uses this managed-mode design is also simple. First you create a DiscoveryEndpoint instance using the same binding and endpoint address as the discoverable proxy you are communicating with. In the case of SO-Aware, its binding is a NetTcpBinding with all the default settings, and the endpoint address is net.tcp://localhost:7777/discoverableproxy. After you create the DiscoveryClient, you can attach the event handlers mentioned earlier (FindProgressChanged, FindCompleted) to listen for the "Online" and "Offline" messages. To start the process, you invoke the FindAsync() method.

Here’s a code example:

Console.WriteLine("Now using Managed proxy on net.tcp://localhost:7777/discoverableproxy");

DiscoveryEndpoint ept = new DiscoveryEndpoint(new NetTcpBinding(), new EndpointAddress("net.tcp://localhost:7777/discoverableproxy"));

client = new DiscoveryClient(ept);
client.FindProgressChanged += new EventHandler<FindProgressChangedEventArgs>(client_FindProgressChanged);
client.FindCompleted += new EventHandler<FindCompletedEventArgs>(client_FindCompleted);
client.FindAsync(new FindCriteria(typeof(ServiceRepositoryDataService)));

Console.ReadLine();



Putting it all together: Summary

Once you take advantage of the WCF 4.0 WS-Discovery mechanisms, clients can listen for registered services, so that when a service version is registered with SO-Aware, a client is immediately notified that the service metadata is now in SO-Aware. In the upcoming version, the service "Online" and "Offline" announcement features will be more precise and technically correct: we will actually check whether the registered service is truly online. SO-Aware can do this through its pinging mechanism, which is included in the current version of SO-Aware as the "Supports Is Alive" option. We’ll talk more about this option in another posting.

Figure 1 SO-Aware Pinging Mechanism – Supports Is Alive

So, to make a long story short, there are a ton of possibilities here. SO-Aware can integrate with WMI and SCOM to provide real-time monitoring of services going "Online" and "Offline". SQL Reporting mechanisms can be built to track who adds service versions, who takes them offline, and more. SharePoint can be cataloged through its REST-based services. Mobile devices can have clients that go through SO-Aware to retrieve their configuration and be notified when new services are available to take advantage of.

You need to Become SO-AWARE!!!

Take a look at some of the screen shots:

Figure 2 Client Using Broadcast Ad Hoc Discovery

Figure 3 Client Using Managed Mode Discover through SO-Aware

Figure 4 Adding and Removing Service Versions in SO-Aware

Web Service discovery in SO-Aware (part 1)

Introduction

Web services have been around for over 10 years now in some form or fashion, which means there are millions of services in existence, on the public internet as well as on private networks, and most of them lack one thing in common: discovery. Discovery is the mechanism by which web services and clients can be made aware of other web services and clients on a network. Discovery comes in two basic designs. In the first design pattern, a client or service queries a central repository of registered services to find a service and discover details about it. The second design pattern is based on broadcasting and notification, where a special protocol is used to announce a client’s or service’s joining and leaving the network.
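
The broadcast-and-notification design boils down to a publish/subscribe exchange. A minimal language-neutral sketch (Python, purely illustrative; none of these names come from any real discovery API) of that pattern:

```python
# Conceptual sketch of the broadcast/notification discovery pattern:
# services announce joining ("hello") and leaving ("bye") the network,
# and every subscribed client is notified. All names are hypothetical.

class AnnouncementBus:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, on_hello, on_bye):
        # A client registers two callbacks: one for services coming
        # online, one for services going offline.
        self.subscribers.append((on_hello, on_bye))

    def hello(self, service):
        for on_hello, _ in self.subscribers:
            on_hello(service)

    def bye(self, service):
        for _, on_bye in self.subscribers:
            on_bye(service)

bus = AnnouncementBus()
online, offline = [], []
bus.subscribe(online.append, offline.append)

bus.hello("net.tcp://host/orders")  # a service joins the network
bus.bye("net.tcp://host/orders")    # the same service leaves
```

The query pattern, by contrast, is pull-based: nothing is pushed to the client until it asks the repository a question.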

When it comes to developing web services on the Microsoft technology stack, Microsoft first introduced the SOAP Toolkit. ASP.NET ASMX followed as a way to quickly build web services, then the WSE extensions, and finally, as the most current technology, Windows Communication Foundation in version 4.0 of the .NET Framework. WCF and .NET 4.0 bring some interesting new features to the table; one in particular deals with web service discovery using the WS-Discovery standard. This standard takes the broadcasting-and-announcement design approach to discovery.

How Discovery Worked Prior to Today

Prior to this version of WCF, the only supported discovery mechanism was querying a central repository, usually UDDI, sometimes through a class called the MetadataResolver, but mostly through the UDDI API classes and methods. UDDI provided a central registry to store information about available services. It supplied a catalog where clients and services could find services that meet their needs, similar to a yellow-pages phone directory that lets a client or service find other services by name, address, contract, category, or other metadata. UDDI can be thought of as a very poor implementation of DNS for web services.

Needless to say, the UDDI API stack never really made headlines. The API was far too complex, its development model was too clunky for the web service execution platform, and no one except the inventors of UDDI understood tModels, especially if all you wanted was the URL of a service named “OrdersService”. What I mean by this is that web services were based on the HTTP protocol, with SOAP additions, in a stateless model. While UDDI was also built on HTTP and SOAP, its implementations were clunky and dealt more with the business and its business model than with the actual service, so it never really made an impact on the web service technology space. For example, to retrieve the details you were looking for, you needed to create multiple tModels and tModelRequests and parse through multiple tModelResponses just to get to the specific information being requested, instead of “one simple query call, that’s all.”

I digress, a little. There was nothing wrong with the query design pattern, just the UDDI implementation. We at Tellago Studios believe the query design pattern has its benefits as long as the implementation follows a more web-service, SOA-centric conception of a central repository. SO-Aware does just that: it provides a SOA-centric RESTful implementation for querying a central repository. If you want the URL of a registered service, just query it using OData. SO-Aware doesn’t stop there, though. Because of its RESTful implementation, I will show in this two-part article how SO-Aware can take advantage of both querying and the second design pattern, broadcast and announcements, using the WS-Discovery protocol and WCF 4.0. Part 1 of this article continues the discussion of querying a central repository, whereas part 2 covers WCF 4.0, the WS-Discovery mechanism, and how SO-Aware can be leveraged with it. (Read Part 2 here)
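
To give a feel for how lightweight such a query is compared to the tModel dance, here is a hedged sketch (Python, purely illustrative) of composing an OData query URI. The entity-set name SoapDescriptions and the ServiceVersion expand mirror the repository queries shown earlier in this post; the exact filter expression is my own example, not a documented SO-Aware query:

```python
# Sketch of composing an OData query URI against a central repository.
# The entity set and expand option mirror the repository queries shown
# earlier in this post; the filter expression is illustrative.
from urllib.parse import quote

def odata_query(repository, entity_set, expand=None, filter_expr=None):
    params = []
    if expand:
        params.append("$expand=" + expand)
    if filter_expr:
        # Percent-encode the filter, keeping '/' for property paths.
        params.append("$filter=" + quote(filter_expr))
    url = f"{repository}/{entity_set}"
    return url + ("?" + "&".join(params) if params else "")

uri = odata_query(
    "http://localhost:8088/SO-Aware/ServiceRepository.svc",
    "SoapDescriptions",
    expand="ServiceVersion",
    filter_expr="ServiceVersion/Name eq 'SampleService(1.0)'",
)
print(uri)
```

One GET against a URI like this, instead of multiple tModelRequests and responses, is the whole point of the RESTful approach.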

How SO-Aware Handles Querying the SO-Aware Service Repository

Let’s first tackle the first design pattern for discovery: querying a central repository. SO-Aware provides its own service metadata repository and contains a set of classes for both a client and a service to use when querying it, as well as the core WCF Data Service, the SO-Aware Service Repository, which can be queried directly using the OData protocol. (For more information on these classes and the core Service Repository, see our videos on the Server Side API, parts 1 and 2, along with the Client Side API, parts 1 and 2.) To keep in tune with querying the central repository, I’ll focus on the client-side classes in this post and handle server-side querying in another post.

To query the central repository from the client, SO-Aware provides two classes: ConfigurableProxyFactory and ConfigurationResolver. These classes can be found inside the Tellago.ServiceModel.Governance.ServiceConfiguration.dll assembly.

To borrow from the documentation: The ConfigurableProxyFactory class is used to configure a client proxy for communicating with the SO-Aware Service Repository to retrieve configuration, behavior and binding information. It is meant to be used within client source code for dynamically resolving configuration information. Behind the scenes it uses the WCF ChannelFactory class to create WCF Communication objects through channels for sending and receiving queries and responses. The steps to use this class are simple:

  1. Create an instance of this class in the client / service application.
  2. In the constructor, specify the SO-Aware configuration query: the repository URI and the service version to resolve.
  3. Create a WCF channel based on the service contract interface expected for communication.
  4. Invoke your service operations; you’re done.

Here’s a code example:

ConfigurableProxyFactory<ISampleService> factory = new ConfigurableProxyFactory<ISampleService>(new Uri("http://localhost:8088/SO-Aware/ServiceRepository.svc"), "SampleService(1.0)", null);

ISampleService service = factory.CreateProxy();

var response = service.DoOperation(dataToService);

Using the ConfigurationResolver class is just as simple. This class is used to manually resolve SO-Aware Service Repository bindings, endpoints, and endpoint behaviors. It is meant to be used within client or service source code for dynamically and manually resolving configuration information. The steps to use this class are:

  1. Determine which WCF component is needed for communication: Binding, Endpoint, or Endpoint Behavior.
  2. Create an instance of the ConfigurationResolver class.
  3. Manually create any supporting WCF components for communication such as Addresses, Bindings, Endpoints and Contracts, depending on what components you already have.
  4. Invoke the ConfigurationResolver.Resolve*() methods to retrieve Bindings, Endpoint information such as EndpointAddress, Contracts, and Behaviors and let SO-Aware create the rest of the components for you.

Here’s a code example:

ConfigurationResolver resolver = new ConfigurationResolver("http://localhost:8088/SO-Aware/ServiceRepository.svc");

var binding = resolver.ResolveBinding("wsHttpBinding_ISampleServiceBindingName");
var endpoint = resolver.ResolveEndpoint("wsHttpBinding_ISampleServiceEndpointName");
var behavior = resolver.ResolveEndpointBehavior("wsHttpBinding_ISampleServiceBehaviorName");

SampleServiceClient proxy = new SampleServiceClient(binding, endpoint.Address, new DnsEndpointIdentity("localhost"));

var result = proxy.InvokeSomeOperation();
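
Conceptually, ConfigurationResolver is a name-to-configuration lookup. A minimal stand-in (Python, purely illustrative; the registry contents are invented and this is not the Tellago API) captures the idea:

```python
# Minimal stand-in for a Resolve*-style lookup: named configuration
# entries are fetched from a registry by category and name.
# The registry contents below are invented for illustration.
registry = {
    "bindings": {
        "wsHttpBinding_ISampleServiceBindingName": {
            "kind": "wsHttpBinding", "securityMode": "Message"},
    },
    "endpoints": {
        "wsHttpBinding_ISampleServiceEndpointName": {
            "address": "http://localhost/Sample.svc"},
    },
}

def resolve(category, name):
    try:
        return registry[category][name]
    except KeyError:
        # Mirrors the failure mode when a name is not registered.
        raise LookupError(f"{name!r} is not a registered entry in {category}")

binding = resolve("bindings", "wsHttpBinding_ISampleServiceBindingName")
print(binding["kind"])  # -> wsHttpBinding
```

The value of resolving by name is that the binding and endpoint details live in the repository, so changing them never requires recompiling the client.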

Well, that’s all for now. In the next part, part two, I will go into the details of how SO-Aware works with WCF 4.0 discovery using ad hoc mode.

How to Setup SO-Aware to run on .Net 4.0

Steps

SO-Aware was created using the .NET 3.5 Framework. To support WCF 4.0, we must tell the web server hosting SO-Aware to use ASP.NET 4.0 and tell ASP.NET to recompile the application against the .NET 4.0 assemblies. One of the nice features of ASP.NET is that it can dynamically recompile your web application on startup. To set this up, open IIS Manager and navigate to the application pool that runs SO-Aware:

To do this, open IIS:

  1. First, switch the application pool SO-Aware is running in to use .NET 4.0.

  2. Next, install ASP.NET 4.0 into IIS if that hasn’t already been done. Open a command prompt as administrator and navigate to this folder:
  3. C:\windows\Microsoft.NET\Framework64\v4.0.30128\
  4. Type in this line:

aspnet_regiis.exe -ir

  5. Lastly, change the web.config file to use the .NET 4.0 assembly references by copying the web config file below. (Remember to change the <connectionStrings> to point to the correct SQL Server or SQL Express databases, and the <appSettings> values to point to the correct SO-Aware ServiceRepository.svc and Tracking.svc URLs.)

<configuration>
  <system.diagnostics>
    <sharedListeners>
      <add name="EventLog" type="System.Diagnostics.EventLogTraceListener" initializeData="SO-Aware"/>
    </sharedListeners>
    <sources>
      <source name="Tellago.ServiceModel.Governance.Monitoring" switchValue="Warning, Error">
        <listeners>
          <add name="EventLog"/>
        </listeners>
      </source>
    </sources>
  </system.diagnostics>
  <connectionStrings>
    <add name="ResourceRepositoryContext" connectionString="metadata=res://*/ResourceRepository.csdl|res://*/ResourceRepository.ssdl|res://*/ResourceRepository.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=ServiceRepository;Integrated Security=SSPI;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient"/>
    <add name="TrackingEntities" connectionString="metadata=res://*/TrackingModel.csdl|res://*/TrackingModel.ssdl|res://*/TrackingModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=ServiceRepository;Integrated Security=SSPI;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient"/>
  </connectionStrings>
  <appSettings>
    <add key="ResourceRepositoryUri" value="http://localhost:8090/SO-Aware/ServiceRepository.svc"/>
    <add key="TrackingRepositoryUri" value="http://localhost:8090/SO-Aware/TrackingData.svc"/>
    <add key="ChartImageHandler" value="storage=memory;deleteAfterServicing=true;"/>
  </appSettings>
  <system.serviceModel>
    <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
    <services>
      <service behaviorConfiguration="tracking" name="Tellago.ServiceModel.Governance.Monitoring.TrackingService">
        <endpoint address="Tracking" binding="netTcpBinding" bindingConfiguration="tracking" contract="Tellago.ServiceModel.Governance.Monitoring.ITrackingService"/>
      </service>
      <service name="Tellago.ServiceModel.Governance.Monitoring.TrackingServiceWeb" behaviorConfiguration="tracking">
        <endpoint address="" binding="webHttpBinding" bindingConfiguration="trackingWeb" contract="Tellago.ServiceModel.Governance.Monitoring.ITrackingServiceWeb" behaviorConfiguration="trackingWeb"/>
      </service>
    </services>
    <bindings>
      <netTcpBinding>
        <binding name="tracking" maxBufferPoolSize="524288" maxBufferSize="5242880" maxReceivedMessageSize="5242880">
          <readerQuotas maxStringContentLength="5242880"/>
        </binding>
      </netTcpBinding>
      <webHttpBinding>
        <binding name="trackingWeb" maxBufferPoolSize="524288" maxBufferSize="5242880" maxReceivedMessageSize="5242880">
          <readerQuotas maxStringContentLength="5242880"/>
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Windows"/>
          </security>
        </binding>
      </webHttpBinding>
    </bindings>
    <behaviors>
      <endpointBehaviors>
        <behavior name="trackingWeb">
          <webHttp/>
        </behavior>
      </endpointBehaviors>
      <serviceBehaviors>
        <behavior name="tracking">
          <serviceThrottling maxConcurrentCalls="16" maxConcurrentSessions="20"/>
          <serviceMetadata httpGetEnabled="false"/>
        </behavior>
      </serviceBehaviors>
    </behaviors>
  </system.serviceModel>
  <system.web>
    <!--
      Set compilation debug="true" to insert debugging symbols into the
      compiled page. Because this affects performance, set this value to
      true only during development.
    -->
    <compilation debug="true" targetFramework="4.0">
      <assemblies>
        <add assembly="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
        <add assembly="System.Web.Abstractions, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
        <add assembly="System.Web.Routing, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
        <add assembly="System.Data.Linq, Version=4.0.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089"/>
        <add assembly="System.Data.Services.Client, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"/>
        <add assembly="WindowsBase, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
      </assemblies>
    </compilation>
    <!--
      The <authentication> section enables configuration of the security
      authentication mode used by ASP.NET to identify an incoming user.
    -->
    <authentication mode="Windows"/>
    <identity impersonate="true"/>
    <authorization>
      <deny users="?"/>
    </authorization>
    <!--
      The <customErrors> section enables configuration of what to do if/when
      an unhandled error occurs during the execution of a request.
      Specifically, it enables developers to configure html error pages to be
      displayed in place of an error stack trace.

    <customErrors mode="RemoteOnly" defaultRedirect="GenericErrorPage.htm">
      <error statusCode="403" redirect="NoAccess.htm" />
      <error statusCode="404" redirect="FileNotFound.htm" />
    </customErrors>
    -->
    <pages controlRenderingCompatibilityVersion="3.5" clientIDMode="AutoID">
      <controls>
        <add tagPrefix="asp" namespace="System.Web.UI.DataVisualization.Charting" assembly="System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
      </controls>
      <namespaces>
        <add namespace="System.Web.Mvc"/>
        <add namespace="System.Web.Mvc.Ajax"/>
        <add namespace="System.Web.Mvc.Html"/>
        <add namespace="System.Web.Routing"/>
        <add namespace="System.Linq"/>
        <add namespace="System.Collections.Generic"/>
      </namespaces>
    </pages>
    <httpHandlers>
      <add path="ChartImg.axd" verb="GET,HEAD" type="System.Web.UI.DataVisualization.Charting.ChartHttpHandler, System.Web.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" validate="false"/>
    </httpHandlers>
  </system.web>
  <!--
    The system.webServer section is required for running ASP.NET AJAX under
    Internet Information Services 7.0. It is not necessary for previous
    versions of IIS.
  -->
  <system.webServer>
    <validation validateIntegratedModeConfiguration="false"/>
    <modules runAllManagedModulesForAllRequests="true"/>
    <handlers>
      <remove name="UrlRoutingHandler"/>
    </handlers>
  </system.webServer>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="System.Web.Mvc" publicKeyToken="31bf3856ad364e35"/>
        <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0"/>
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Once you’ve copied over this configuration file, recycle your IIS application pool and restart IIS. Then open the SO-Aware web site and verify everything works as planned.

Using the SO-Aware PowerShell Provider to Register Services

With the creation of SO-Aware (http://www.tellagostudios.com), a service metadata repository that can do all sorts of things, comes the added benefit of using Microsoft’s Windows PowerShell. To borrow a line from Microsoft’s site, PowerShell is a command-line shell and scripting language that helps IT professionals achieve greater control and productivity, more easily control system administration, and accelerate automation. We at Tellago Studios implemented a set of over 40 standard command-line tools to help manage and control the SO-Aware repository. Outlined below are the steps needed to install, register, configure, and add SOAP and RESTful services inside SO-Aware.

How To Install the SO-Aware PowerShell Provider

  1. First, download the PowerShell Provider setup files here: http://www.tellagostudios.com/projectfiles/SO-Aware_PowerShell.zip
  2. Next, extract the files to a directory of your choosing and run the SO-Aware PowerShell.msi Windows installer file as an administrator.
  3. Step through the setup dialogs.
  4. After the install has completed successfully, the SO-Aware PowerShell snap-in, named “SOAwareSnapIn”, is registered with the PowerShell global system. To use the SO-Aware commandlets you must open a PowerShell session and add the SOAwareSnapIn to the current session. At that point, the SO-Aware PowerShell commandlets can be used.

Working with a PowerShell Session

  1. Open a PowerShell prompt (on Windows Server 2008, look for the PowerShell icon).
  2. Inside the PowerShell command prompt, type: add-PSSnapIn SOAwareSnapIn
  3. The line above adds the SO-Aware PowerShell commandlets into the current PowerShell session. Each session you start will need the snap-in added in order to use the SO-Aware PowerShell commandlets. (There are scripts you can run to alleviate this repetitive process; after the install, just navigate to Start -> All Programs -> Tellago Studios -> SO-Aware -> SOAware.ps1)
  4. After you’ve loaded the SO-Aware commandlets, configure the current SO-Aware PowerShell Provider session to point to the SO-Aware Service Repository. Type: set-SWEndpoint -url 'http://localhost:8088/SO-Aware/ServiceRepository.svc' (Just remember to replace this URL with the URL of your SO-Aware Service Repository.)

Registering a SOAP Service inside the SO-Aware Repository

  1. To add any type of service (REST, OData, or SOAP), we use the add-swservice commandlet. The Style parameter determines the service type.
  2. Inside the PowerShell command prompt type: add-swservice -Name 'People Finder' -Namespace 'http://tempuri.org' -Style 'SOAP'
  3. Style supports three options: SOAP, REST, ODATA

  4. You should see text saying that the service was added successfully; otherwise an error will appear describing the issue, such as:

  5. To complete the service registration, we need to add a ServiceVersion inside the Service Repository and attach it to the newly added service.
  6. To add a service version, we use the add-sw[Type]ServiceVersion commandlet, where [Type] is REST, OData, or SOAP.
  7. To add the service version, we first need the service ID of the newly added service, so use the get-swServices commandlet to retrieve it.
  8. Type: get-swServices -Name 'People Finder'
  9. This retrieves all services named 'People Finder'; you could actually leave -Name off and all registered services would be returned:

  10. Once you master PowerShell, you can also filter values and pipe a commandlet’s output into other commandlets, so all you really need is one command to get the service ID and pass its value into the add-service-version commandlet.
  11. Now that we have the service ID, we can add the service version using the add-swSoapServiceVersion commandlet.
  12. Type: add-swSoapServiceVersion -ServiceId 'eab4cad9-e0ad-4fd7-941c-2f3641dd89e5' -Version '1.0' -url 'http://localhost:1245/PeopleFinder.svc?wsdl' -Configuration 'Production'
  13. Here, ServiceId is the actual GUID of your service, Version is the version number you want to add, Configuration is the category name you want to file the service version under, and the URL is the live location of the WSDL for the SOAP service.

  14. That’s it. You’re done!
  15. Go to your SO-Aware web site and see your results:
  16. Added service
  17. Added service version:
  18. Added version details, inferred from the WSDL

Registering a REST and OData Service inside the SO-Aware Repository

  1. To add either a REST or an OData service, we use the add-swservice commandlet.
  2. Inside the PowerShell command prompt type: add-swservice -Name 'REST WebService' -Namespace 'http://localhost/WebStyle.Host/Service.svc' -Style 'REST'
  3. Style supports three options: SOAP, REST, ODATA

  4. You should see text saying that the service was added successfully
  5. To complete the service registration, we need to add a ServiceVersion inside the Service Repository and attach it to the newly added service.
  6. To add a REST service version, we use the add-swRESTServiceVersion commandlet; for OData, we use add-swODataServiceVersion.
  7. To add the service version, we first need the service ID of the newly added service, so use the get-swServices commandlet to retrieve it.
  8. Type: get-swServices -Name 'REST WebService'
  9. This retrieves all services named 'REST WebService'; you could actually leave -Name off and all registered services would be returned:

  10. Now that we have the service ID, we can add the service version using the add-swRESTServiceVersion commandlet.
  11. Type: add-swRESTServiceVersion -ServiceId '7ea50e22-7262-4d09-8611-07cf62e8e555' -Version '1.0' -BaseUri 'http://localhost:4371/WebStyle.Host/Service.svc' -Configuration 'Production'
  12. For OData we’d use:
  13. Type: add-swODataServiceVersion -ServiceId '7ea50e22-7262-4d09-8611-07cf62e8e555' -Version '1.0' -ODataUrl 'http://localhost:4371/WebStyle.Host/Service.svc' -Configuration 'Production'
  14. Here, ServiceId is the actual GUID of your service, Version is the version number you want to add, Configuration is the category name to file the service version under, and BaseUri is the live URL of your REST service (or ODataUrl is the live URL of your OData service).

  15. That’s it. You’re done!
  16. Go to your SO-Aware web site and see your results:

  17. Service version details:
  18. Note that this version of the SO-Aware PowerShell commandlets does not yet support adding REST operations, so you’ll have to add the operations manually through the web-based UI.

Working with some of the remaining SO-Aware Commandlets

  1. All SO-Aware commandlets follow a simple pattern:
    1. GET-SW*s
    2. ADD-SW*
    3. SET-SW*
    4. REMOVE-SW*
    5. where the “*” is the name of the artifact or entity you’d like to query or update.
  2. So, for example, to get all services you would use: GET-SWServices
  3. To see which parameters are required for each commandlet, just execute the commandlet inside the PowerShell prompt; it will prompt you for the required parameters.
  4. To see the optional parameters, download the documentation at: http://tellagostudios.com/sites/default/files/SO-Aware%20API%20Documentation.zip
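
The naming convention above is regular enough to generate mechanically: given an artifact name, the four commandlet names fall out. A quick sketch (Python; the helper itself is hypothetical, not something SO-Aware ships):

```python
# Hypothetical helper that derives the four SO-Aware commandlet names
# for a given artifact, following the GET/ADD/SET/REMOVE-SW* pattern
# described above (note the plural "s" on the query form).
def commandlets(artifact):
    return {
        "query": f"GET-SW{artifact}s",   # queries use the plural form
        "add": f"ADD-SW{artifact}",
        "update": f"SET-SW{artifact}",
        "remove": f"REMOVE-SW{artifact}",
    }

print(commandlets("Service")["query"])  # -> GET-SWServices
```

Knowing the pattern means you rarely need the documentation just to guess a commandlet name; the docs are mainly for the optional parameters.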

Introducing SO-Aware

Some of you know that we’ve been working double time: once as consultants developing and implementing SOA-based solutions, and again as developers creating a REST-based registry called “SO-Aware“. Well, I’d like to introduce you to our newest product here at Tellago Studios, Inc.

SO-Aware

What is SO-Aware? Well the answer lies in the story behind how and why we created it.

The answer is simple: we basically got tired of going into clients and hearing the same questions over and over again: “Which service does so and so? Where does it reside? Is it behind the firewall? Is there security implemented on it? Who built that service when we already have one that does that? How do we version this service, because the old version doesn’t support this and that, but the new one needs to? What other services depend on that one? What’s OData, and how do we integrate it? Can we govern our REST endpoints?” And on and on… I hope you get the picture…

We decided to put our heads down for a while, work a double shift if you will, and answer those questions with a product you can download today, for free!!! It’s called SO-Aware.

I’ll let the product speak for itself. In the meantime, download it, check it out, and use it. Leave feedback so we can make it better.