News & Posts
How to Git
Initializing your project repo and commit/push to remote origin
When you create a project in Azure DevOps, there are multiple options for starting to use Git on your project. These options are mentioned below:
Push an existing repository
This is the best option when setting up a project from scratch. It requires you to create a new local Git repository, commit your files, and then push them to the remote origin.
Follow the steps below to do all of the above:
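As a rough command-line sketch (the remote URL below is a placeholder, so use the clone URL from your own Azure DevOps project, and note that your default branch may be named differently):

    git init
    git add .
    git commit -m "Initial commit"
    git remote add origin https://dev.azure.com/<organization>/<project>/_git/<repository>
    git push -u origin master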
Creating branches on the Remote/Origin repo using Azure DevOps
How to create a branch for work items in the remote repo
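Branches for work items are typically created from the work item itself or from the Branches page in the Azure DevOps web UI. If you prefer the command line, a rough equivalent is to create the branch locally and push it to the remote; the branch name below is purely illustrative:

    git checkout -b feature/1234-short-description
    git push -u origin feature/1234-short-description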
How to pull the changes back to local repo
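A minimal sketch, assuming the branch already exists on the remote (the branch name is illustrative):

    git fetch origin
    git checkout feature/1234-short-description
    git pull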
How to remove heads for all remote branches on local repo
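A common way to do this is to prune remote-tracking branches that no longer exist on the remote; either of the commands below works:

    git fetch --prune
    git remote prune origin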
NOTE: If you have not deleted the remote branch in Azure DevOps, you can also use the commands below: git branch -rd <branch-name> followed by git pull. Keep in mind that -rd only removes the remote-tracking branch from your local repo; the branch on the remote itself is not deleted.
How to remove local branches from local repo when they are not needed
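A quick sketch (branch name illustrative); -d deletes a branch only if it has already been merged, while -D forces the deletion:

    git branch -d <branch-name>
    git branch -D <branch-name>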
SharePoint Projects Intake Process
Caching problem with IE
My customer has been complaining over and over about data not being saved in the AngularJS app. I tried troubleshooting in Chrome and everything appeared to work perfectly fine, until I started browsing my Angular app in Internet Explorer. I debugged through the JavaScript, and since I am using Angular Resource there is a certain amount of abstraction I have to live with. The request was submitted correctly to the server and the update was performed successfully in the database. The response resource was collected in the success promise, but the screen only updated when I had the IE toolbar open. At that point I started googling and found it is an issue with the IE cache. The following code in the config fixed my cache problem:

    window.app.config(function($httpProvider) {
        $httpProvider.defaults.cache = false;
        if (!$httpProvider.defaults.headers.get) {
            $httpProvider.defaults.headers.get = {};
        }
        // disable IE ajax request caching
        $httpProvider.defaults.headers.get['If-Modified-Since'] = '0';
    });

The downside of this, of course, is no caching at all: even if the data is not changing rapidly on the site, it is always fetched from the database. I wonder if there is a way to provide this configuration only for specific areas of the site. I believe that can be done by writing a custom interceptor which appends a unique value to the request URL, so that IE doesn't recognize the URL and always pulls from the server.
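For what it's worth, here is a minimal sketch of that interceptor idea, assuming the requests to be cache-busted can be identified by an illustrative '/api/' prefix; the 'nocache' parameter name is also just an example:

    window.app.config(function($httpProvider) {
        $httpProvider.interceptors.push(function() {
            return {
                request: function(config) {
                    // Append a unique query-string value to GET requests for the chosen areas
                    // so IE never sees the same URL twice and cannot serve a cached response.
                    if (config.method === 'GET' && config.url.indexOf('/api/') !== -1) {
                        var separator = config.url.indexOf('?') === -1 ? '?' : '&';
                        config.url += separator + 'nocache=' + new Date().getTime();
                    }
                    return config;
                }
            };
        });
    });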
Managed Metadata under the hood
Metadata repository at a logical and physical level
Physical level: Managed Metadata Service application (MMS). When managed metadata is enabled in our SPS 2010 Central Administration services, a managed metadata service and connection are created automatically. The service identifies the database to be used as the term store, and the connection provides access to the service so that it can be consumed by our site collections.
Logical implementation: Managed Metadata Terms (MMT) is a hierarchical collection of centrally managed terms that we can define and then use in SPS 2010 for items such as Pages, Lists and Libraries. When we create new managed terms, they are stored in the database specified in the MMS. When we publish a managed metadata service, a URL to the service is created; these URLs are then used by our site collections to consume the services.
Managed Metadata Connections (MMC): to consume managed metadata, a web application such as our Authoring environments in all of our farms must have a connection to an MMS. A web application can have connections to multiple services, and the services can be local to the web application or remote on another farm, as long as the farms can talk to each other. When a managed metadata service is first created, a connection to the service is created automatically in the same web application as the service.
A term: a term is a word or phrase that can be associated with an item in SharePoint Server 2010.
Global MMT vs Local MMT
The development lifecycle of a metadata repository
OOTB, SPS 2010 offers a centralised UI so that Term Sets can be easily and logically created, edited, deleted and managed. Term Sets can also be imported using a spreadsheet, and they can be created, edited and deleted programmatically, although that route is more difficult and error prone. IMPORTANT: we cannot use our regular deployment lifecycle to deploy new metadata using Visual Studio solutions. As I explained in the first question, we can consume an MMT as long as the following prerequisites are met:
The nature of our specific secure farm architecture, which uses SQL Authentication between our Authoring and Publishing farms
I can foresee issues such as content migration from Authoring to Publishing breaking, search not being able to index properly, and more.
The idiosyncrasies of a SharePoint 2010 metadata repository
I have listed some of the known issues below:
Document & Records Management in SharePoint 2007 and 2010
I was asked about the difference between the document management and records management features in SharePoint 2007 and 2010, so I decided to write a blurb about it. Enterprises think about enhancing their records management capabilities and often get flummoxed choosing between off-the-shelf products like TRIM and a customized EDRMS solution built on the SharePoint platform. This is not about recommending one or the other, but about pointing out the capabilities built into SharePoint. The Document Management feature of SharePoint Server for enterprise provides the capability to create workspaces and repositories that securely store documents along with their meta-data. Depending on the enterprise's document storage needs and information architecture, a solution that complements the enterprise infrastructure must be designed with the following key components:
Content Types, Team sites, Document libraries, Workflows & Document workspaces.

The state-of-the-art SharePoint enterprise search capability provides the means to crawl and index the content within documents, along with the meta-data, so that it is easily searchable by users. SharePoint document libraries offer a highly customizable framework for storing documents and can be configured to manage document version history, content approval and workflows.

The Records Management feature of SharePoint extends the document management capability and focuses on identifying the information stored in document libraries and workspaces as records. SharePoint enables organizations looking to manage and archive their enterprise records through a document management system (eDRMS) by allowing them to store information in Records Center sites. These Record Centers execute business rules that adhere to the organization's records management policies by maintaining the electronic filing process, labelling, history, audit trails, routing and disposition of these documents. The routing & disposition workflows provide a framework for creating custom retention policies.

Records Management has come a long way since the initial launch of SharePoint in 2003, which had very limited functionality for identifying records and applying retention policies to them. It was only with MOSS 2007 that the Records Center was introduced, with routing & disposition workflows and built-in audit functionality. SharePoint keeps track of all the events and changes that occur within the system, which enables managers and administrators to report on the activities a business user performs and how they interact with the system. However, designing the eDRMS involved customizing SharePoint objects to deliver implicit Records Management capabilities for end users.

Today enterprises need a fast, user-friendly, secure and reliable records management system that coheres to the organizational policies of record keeping and auditing. DCAA, ISO and CMMI are some of the standards used as a baseline by these enterprises, and SharePoint 2010 promises compliance with them by introducing the following capabilities, which will help these enterprises achieve their goals on a massive scale. The document & records management capabilities are greatly enhanced in the latest version of SharePoint (SP2010), which offers many useful features OOTB, including:

- Corporate taxonomy & term stores: enables enterprises to manage groups of hierarchical corporate terms.
- Meta-data publishing / content hub: provides the means to manage global document templates and content types that can be shared across departments and functional areas; this helps implement business classifications and filing policies for record keeping.
- Content organizer: manages routing/retention/disposition workflows across document libraries and workspaces per business classification.
- Document stores: provide conceptual boundaries around documents across business areas.
- Meta-data navigation: simplifies navigation by enabling users to browse information based on business classification, location and custom meta-data.

The above capabilities provide an extremely efficient way to manage records and enable enterprises to design and implement a highly effective, customized eDRMS.
Workflow Task Manager Activity for SharePoint
Some time ago I came across the issue of handling multiple tasks in parallel and managing them independently. It turned out that WF provides a friendly Replicator activity, so I decided to use it to develop a workflow that, upon kick-off, reads items from other lists for assignments, business areas and notifications, and spawns multiple tasks dynamically in parallel. The workflow completes when all of the tasks have been actioned by their assignees, without waiting for one task to finish before the next is created (parallel tasks).
How it works
Thankfully, WF ships with a looping activity that runs a specific child activity multiple times, called the Replicator activity. It has its own challenges and limitations to work within; for instance, you can only add one activity as a child of a Replicator activity. The first and foremost thing one has to get their head around is what happens behind the scenes when the Replicator is set to run in parallel mode. Last but not least, it is important to understand the crux of workflow execution, especially in the SharePoint world. There is plenty of documentation on MSDN about it, so I won't go into detail, but I will mention a few issues I had to face when using the Replicator.
For further details and the download, hop over to the CodePlex project here.
Welcome to my site
Welcome to my new site. Here you will find information about me, the type of work I am involved in, what my areas of expertise are, and so on.