
Coding is much easier than you think

Microsoft Office Graph API

Posted on Mar 24, 2019 in Azure, Office 365 | 0 comments

MS Graph acts as a common service endpoint to access all the services available in Office 365 using simple REST API calls.

  • Developers can now consume data through a single public endpoint (https://graph.microsoft.com) – using simple REST calls or with an SDK available on just about any platform (a minimal call is sketched after this list).
  • Permissions are secure, granted using OAuth protocols.
  • Administrators can grant fine-grained permissions to applications accessing data (instead of global/service administrator or similarly elevated roles) and can revoke access if needed.
  • The Graph API schema can be extended for custom application requirements.
  • Developers can traverse the rich sets of user-centered data, insights and updates available across the wide and growing range of Microsoft 365 products through the Graph API.
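
For example, here is a minimal sketch (in Python, using the requests library) of a call against that single endpoint. It assumes you have already acquired an OAuth access token for Microsoft Graph (for instance via Azure AD / MSAL); the token value below is only a placeholder.

Code:

import requests

# Placeholder OAuth token - acquire a real one via Azure AD / MSAL first
access_token = "<your-oauth-access-token>"

# One endpoint for all Office 365 services: read the signed-in user's profile
response = requests.get(
    "https://graph.microsoft.com/v1.0/me",
    headers={"Authorization": "Bearer " + access_token},
)
response.raise_for_status()
print(response.json())  # JSON document describing the user

The same pattern – changing only the URL path – works for mail, files, groups and the other workloads exposed through Graph.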

What's New in Microsoft Graph

Over the last two years (2017 & 2018), Microsoft has invested heavily in Office 365 & Azure cloud services. One of the major improvements is the development of the Graph API, which exposes data and insights from these cloud services through a single endpoint –
https://graph.microsoft.com.

Let's see what the Microsoft Graph API is capable of, in simple terms:

(Video source: Channel 9)

Who should use it?

Let us say I have an interface – a web app, a device/native app or a bot – that needs to respond dynamically to user behaviour with rich insights about my organization, users and activities, together with a dynamically growing marketplace of apps. In that case, MS Graph is the right API for development.

Ex: If I am a SharePoint developer, I will be using the SharePoint OData REST APIs for consuming the O365 site/list/user context. With those alone, I won't be able to go further – user mails, OneDrive files, Teams updates, activities from other O365 apps, or managing security & app permissions.

In scenarios like these, where we need access to the entire set of Office 365 apps in the context of a user, we can make use of the O365 Graph APIs, as sketched below.
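
As a rough sketch of that scenario (again Python with the requests library, and again assuming a pre-acquired token – here one that carries the Mail.Read and Files.Read permissions), the standard Graph v1.0 endpoints let the same app read mails and OneDrive files:

Code:

import requests

access_token = "<your-oauth-access-token>"  # placeholder - needs Mail.Read / Files.Read scopes
headers = {"Authorization": "Bearer " + access_token}

# Latest 5 mail messages of the signed-in user
mails = requests.get(
    "https://graph.microsoft.com/v1.0/me/messages?$top=5", headers=headers
).json()

# Files in the root folder of the user's OneDrive
drive = requests.get(
    "https://graph.microsoft.com/v1.0/me/drive/root/children", headers=headers
).json()

for item in drive.get("value", []):
    print(item["name"])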

Which apps are supported?

As of now, the following O365 app endpoints are supported by the MS Graph API –

*Endpoints marked (beta) are still in preview and are not yet part of General Availability.


How to Publish a .NET Core Solution to VSTS – Visual Studio Team Services Online Git Repository using the Git Command Line

Posted on Jan 26, 2019 in Azure | 0 comments

[5 mins read]

In this tutorial, we will discuss how to publish a .NET Core solution to the VSTS – Visual Studio Team Services – online Git repository using the Git command line.

Pre-requisites:

1. Copy the clone URL for your Git repository. In case you are new to VSTS, follow the steps in this link to create a repository in Visual Studio Team Services (VSTS) online.
2. This tutorial assumes you already have an existing .NET project or solution that needs to be pushed. In case you don't have one, create it in the free Visual Studio Community IDE (File -> New Project -> Web -> Web API with .NET Core, and select Create New Solution).
3. Git for Windows is installed on your local machine.
4. Open Windows Command Prompt (Start -> Run -> type 'cmd'), Windows PowerShell, or the terminal inside VS Code, at the project folder/solution path.

Let's see how to publish the local solution to the VSTS online Git repository using simple Git commands.

1. Initialize Git by running the below command:

Code:

git init .

The above command initializes an empty Git repository in the current directory inside the project folder.

2. Connect to the online repository by adding it as a remote, using the below command:

Code:

git remote add origin https://tfs.company.com/tfs/Project/_git/Asp-Net-Core-Web-api

The above command links the local repository to the VSTS online Git repository under the remote name 'origin' (paste the clone URL you copied in the prerequisites); this name is used again in the push command in step 5.

3. Stage all your local file changes by using the below command:

Code:

git add --all
git status

The above commands stage every file change in the project folder and list the files that will go into the next commit to the online repository.

4. Commit the staged changes with a message by running the below command:

Code:

git commit -m "Initial Push to Online Repo from Local"


git status

The commit records the staged files in your local repository with the given message; git status should now report a clean working tree.

5. Finally, push your local commits to the online repository:

Code:

git push -u origin master

or
git push origin <branch or feature name>

If the push succeeds, you will see a status summary with the file count, the number of compressed objects and the threads used, as Git writes the objects to the remote.

During a commit, the local changes are stored as Git objects and packaged with an index. Each further commit updates the index together with its '-m' message value, and these changes are best viewed in the tree view of commits available in VSTS Online -> Code (History view) -> sort by tree view.

This approach publishes code purely through the command-line interface, which is useful when we need to push code to an online repository from a desktop that does not have the Visual Studio IDE installed.

Happy coding !:)

Cheers!:)


Core Cloud Services – Azure data storage options

Posted on Jan 21, 2019 in Azure | 0 comments

[3 mins read]

In this tutorial, we will discuss the facts & advantages of Azure data storage options over the classic, cost-heavy on-premises storage model.

So, what will we be reading about for the next 3 minutes? Yes!

  1. We will explore different Azure Data Storage Options,
  2. Comparison between Azure Data Storage & On-premise Data Storage

 

Introduction:

As Microsoft has said, its next 5 years will be heavily focused on the Internet of Things & Artificial Intelligence. To support this, it has built a strong platform for storing data in the cloud, as discussed earlier in the Core Cloud Services post covering PaaS, IaaS & SaaS.

 

Why Should we Choose Azure Data Storage?

  • Automated backup and recovery: no need for 24×7 support by an admin team.
  • Replication across the globe: Microsoft data centers are spread across regions around the world, so data is very unlikely to be lost if we follow the security, storage & governance models proposed by Microsoft.
  • Support for data analytics: Azure Data Lake Storage Gen2 and App Insights take care of your application data & site analytics in depth.
  • Encryption capabilities: we might wonder, if we use the public cloud operated by Microsoft, "do I need to worry about my data being leaked or shared?" Microsoft addresses this through container virtualization, VM virtualization & cryptographic key policies for the multi-tenant workspace.
  • Multiple data types: blobs, files and queues help you save different file formats & their server rendition models.
  • Data storage in virtual disks: Azure can store up to 8 TB of data in virtual disks – a good option to consider for large blob images, videos, multimedia formats & their search features.
  • Storage tiers: automatic assignment of VMs or storage containers helps our clients sleep peacefully (provided a budget limit is set in the Azure portal using the Azure Cost Management preview feature).

What are Types of data Azure Supports?

Microsoft arrived at the below data types based on the storage design requirements. Let's see a few examples:

  1. Structured – data with a proper schema. Ex: Azure SQL Database, a database as a service (DaaS)
  2. Semi-structured – data that can be identified by unique keys or tags and can contain any number of columns. Ex: Azure Cosmos DB, a NoSQL database
  3. Unstructured – documents, files, blobs (images, videos). Ex: storage containers (blobs, files, queues, tables)

1. Azure SQL Database

 

  • Azure SQL Database is a relational database as a service (DaaS).
  • SQL Database is a high-performance, reliable, fully managed and secure database.
  • Migrate existing SQL Server databases using the Azure Database Migration Service (backed by the Microsoft Data Migration Assistant). A minimal connection sketch follows this list.
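
Here is a minimal connection sketch in Python, using the pyodbc package and the Microsoft ODBC Driver for SQL Server. The server, database and credential values are placeholders you would replace with your own:

Code:

import pyodbc  # pip install pyodbc (requires the ODBC Driver for SQL Server)

# Placeholder connection details - replace with your own Azure SQL server and credentials
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-database>;"
    "Uid=<your-user>;Pwd=<your-password>;"
    "Encrypt=yes;"
)

cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")  # list a few tables
for row in cursor.fetchall():
    print(row.name)
conn.close()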

2. Azure Cosmos DB

  1. Azure Cosmos DB is a globally distributed database service built on a PaaS strategy.
  2. It supports schema-less databases – you can change the metadata at any point in time. Ex: I created an Azure Cosmos DB account and a collection, then added a first row/item with 2 metadata fields (Title, ID). In the second row, I can push new metadata fields (Image URL, Link URL, Description) in addition to the existing ones, as sketched after this list. It is one of the coolest features for Azure developers, and SQL queries are supported too.
  3. Global availability makes it a top-tier database choice for architects.
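
Here is a small sketch of that schema-less behaviour in Python with the azure-cosmos SDK. The account endpoint, key, database and collection names are placeholders, and the collection is assumed to use /id as its partition key:

Code:

from azure.cosmos import CosmosClient  # pip install azure-cosmos

# Placeholder endpoint and key - take the real values from your Cosmos DB account in the portal
client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
container = client.get_database_client("DemoDb").get_container_client("DemoCollection")

# First item with only two metadata fields
container.upsert_item({"id": "1", "Title": "First item"})

# Second item introduces new metadata fields - no schema change required
container.upsert_item({
    "id": "2",
    "Title": "Second item",
    "ImageUrl": "<image-url>",
    "LinkUrl": "<link-url>",
    "Description": "Extra fields added on the fly",
})

# SQL-like queries are supported as well
for item in container.query_items("SELECT c.Title FROM c", enable_cross_partition_query=True):
    print(item)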

3. Azure Blob storage

 

  1. Azure Blob Storage is unstructured, meaning there are no restrictions on the kind of data it can hold.
  2. Blob Storage can manage thousands of simultaneous uploads, massive amounts of video data and constantly growing log files, and can be reached from anywhere with an internet connection (see the upload sketch after this list).
  3. Blobs aren't limited to common file formats – a blob could contain gigabytes of streamed binary data.
  4. Azure can store up to 8 TB of data in virtual disks for virtual machines.
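
Here is a minimal upload sketch in Python with the azure-storage-blob SDK. The connection string, container and file names are placeholders, and the container is assumed to already exist:

Code:

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Placeholder connection string - copy the real one from your storage account in the portal
service = BlobServiceClient.from_connection_string("<your-storage-connection-string>")
blob = service.get_blob_client(container="videos", blob="demo.mp4")

# Blobs are not limited to common file formats - any binary stream can be uploaded
with open("demo.mp4", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(blob.url)  # the blob can now be reached over an internet connection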

 

4. Azure Data Lake Storage Gen2

  1. Contains both structured and unstructured data.
  2. Mainly used for analytics purposes.
  3. Has Big Data file-system capabilities.


5. Azure Files

  1. Azure Files offers fully managed file shares in the cloud that are accessible via the industry standard Server Message Block (SMB) protocol.
  2. Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS.
  3. The data is encrypted at rest, and the Server Message Block (SMB) protocol ensures it is encrypted in transit.

6. Azure Queues

  1. Azure Queue storage is a service for storing large numbers of messages that can be accessed from anywhere in the world.
  2. A single queue message is up to 64 KB in size, and a queue can contain millions of messages.
  3. Consists of sender & receiver components for sharing messages. A sender can be a Web API, web app or mobile app, and a receiver can be any service application capable of listening for or accepting messages (see the sketch after this list).
  4. Mainly used for distributing load between servers to balance the network load.
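
Here is a small sender/receiver sketch in Python with the azure-storage-queue SDK. The connection string and queue name are placeholders, and the queue is assumed to already exist:

Code:

from azure.storage.queue import QueueClient  # pip install azure-storage-queue

# Placeholder connection string and queue name - replace with your own
queue = QueueClient.from_connection_string("<your-storage-connection-string>", queue_name="orders")

# Sender side (e.g. a Web API or web app) drops a message of up to 64 KB on the queue
queue.send_message("order-12345 received")

# Receiver side: any service listening on the queue picks the message up
for message in queue.receive_messages():
    print(message.content)
    queue.delete_message(message)  # remove it once processed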

 

Comparison between Azure Data Storage & On-premise Storage Options:

Here are Azure's capabilities over on-premises servers, in simple terms and based on our experience. Add your own pros & cons in the comments section.

  1. A nearly limitless pool of raw compute, storage and networking components that expands dynamically based on customer utilization. On-premises, every additional server, database or load balancer requires an upfront infrastructure fee and depends on admin support.
  2. Speech recognition and other cognitive services that help make your application stand out from the crowd. On-premises, you need to identify third-party tools for each set of enterprise applications.
  3. Analytics services that enable you to make sense of the telemetry data coming back from your software and devices. On-premises analytics rarely reaches that coverage, since it relies on additional calls or a browser agent; third-party tools like AppD and Omniture still lag well behind Microsoft's telemetry services.
  4. Azure-stored data can be mounted concurrently by cloud or on-premises deployments of Windows, Linux and macOS. On-premises, servers must be identified for different software and categorised as WFE, application and database servers, and a dedicated team plus disaster-recovery/backup options have to be planned.
  5. Azure Search supports data crawling, artificial intelligence, cognitive skills and telemetry analysis. On-premises, you need to configure several application servers – for example SharePoint search, crawl and index database content – and procure Business Intelligence licenses.
  6. The automation deployment model supports hybrid deployment to both Azure and on-premises using Azure DevOps: continuous integration with multi-platform gates such as PowerShell, NodeJS and Jenkins (1000+ marketplace tools), continuous build, and container image deployments. On-premises, automation is still a nightmare requiring heavy customization & procurement.
  7. The Azure governance model is protected by built-in security policies, compliance policy centers, encryption policies, and replication & disaster-recovery policies. On-premises, data classification, policy identification & data-breach detection require support from legal agencies and enterprise consultants, plus a continuous data watch.
  8. Azure keeps the operation cost minimal and the running cost tied to utilization. Disk storage cost management (with the Hot, Cool and Archive tier model) has proved to be about 98% accurate in our experience, so clients can prepare their budget well in advance based on the proposed architecture. On-premises, performance, network calls and operation threshold limits all affect operation and running cost, and each additional resource requires procurement, configuration, admin support & budget allocation in the middle of nowhere :)

We can summarize the above points in terms of:

  1. Cost effectiveness (Ex: pay-as-you-go subscription model),
  2. Reliability (Ex: data backup, load balancing, and disaster recovery strategies),
  3. Storage types (Ex: Azure Cosmos DB, storage containers (blobs, files, queues, tables)),
  4. Agility (Ex: changing requirements and adoption of new technologies).

 

Hope the above points justify Microsoft's pricing model :)

Happy Coding!

Cheers :)


How to create a UDCX file for the Main Data Connection in InfoPath?

Posted on May 6, 2013 in Infopath 2010, SharePoint Tutorials | 0 comments

 

To create a UDCX file for the main data connection, you may follow these steps:

Open your InfoPath form and click on "Manage Data Connection…" under the Fields pane in the InfoPath form. In the pop-up window that appears, select your main connection and click on "Convert to Connection File":

Then give the path of your data connection library (on the SharePoint site) in the textbox,

for example: http://sharepoint_site/DataConnectionLibrary/ABC_Main_Connection.UDCX

and click on OK.

After this, you can see that your main connection's UDCX file is available in your site's Data Connection Library.


SharePoint 2013 CAML Designer

Posted on Apr 17, 2013 in SharePoint 2013, SharePoint Tutorials | 0 comments

 
The SharePoint 2013 CAML query builder is a great tool for developers: it helps build CAML queries for use with the Client Side Object Model in SharePoint 2013.

The SharePoint 2013 CAML query builder, with its advanced features, can be downloaded from the link below:

  • http://sharepoint.biwug.be/SitePages/Caml_Designer.aspx

 
