
Pragmatic Odoo Hotel PMS Vs Odoo Community Hotel PMS

| Feature | Pragmatic Odoo Hotel Management | Odoo Community Hotel Management |
| --- | --- | --- |
| Different types of reservation can be handled | Reservations at the desk, through the web, and through a GDS can all be handled | × |
| Agent commission against reservation | Full flow is available | × |
| Advance payment can be taken against reservation | ✓ | × |
| Facility to take identification details of the customer in the reservation form (passed on to the folio), or directly in the folio form | ✓ | × |
| Facility to take pickup details if applicable | Full flow is available for transportation | Not available |
| Advance payment can also be taken in the folio | ✓ | × |
| Foreign exchange details available | ✓ | — |
| Hotel bills are linked with normal and Restaurant POS | Available, not working for Odoo 9 | Available, not working |
| Reservation dashboard (one quick overview of all rooms' reservation status) | Available in detail, with filters by date range, reservation status, shop and room type | Available with date range only, and only works once the folio has been confirmed |
| Restaurant table booking in case of reservation | Available with various validations | Available without validation |
| Table booking order amount computation and invoice linking with reservation, if applicable | Available, computed correctly and linked with the reservation | Available, linked with the reservation but not computed correctly |
| Order amount computation and invoice linking with reservation, if applicable | Available, computed correctly and linked with the reservation | Available, linked with the reservation but not computed correctly |
| KOT and BOT process flows for the restaurant | Both available | Only KOT is available |
| Laundry system flow | Full flow is available | × |
| Hotel housekeeping | ✓ | × |
| Request for repair/replacement flow for housekeeping | ✓ | × |
| Issue material flow for housekeeping | ✓ | × |
| Reporting for room reservation, restaurant and POS | Available, working fine | Available, but incorrect data is returned |
| Restaurant banquet hall enquiry and booking flow | ✓ | × |
| Agent commission flow | ✓ | × |
| GDS integration | ✓ | × |
| Web booking engine | ✓ | × |

VOIP Integration with OpenERP

Outgoing Call -

To make an outgoing call from OpenERP, go to the Customers menu and open any partner form; you will see a "Dial" option next to the phone number.
We used a softphone (Magic App) for outgoing calls.

Incoming Call Existing Partner -

For any incoming call, the system pops up a notification showing the customer name (if it exists in the system) and the phone number. When the user clicks on it, the system opens a dashboard with all the customer's details.

Customer Details (Global Search) -

After clicking on the incoming-call popup notification for an existing partner, the system opens a dashboard with the customer's details: sales orders, partners, customer invoices, delivery orders, customer payments, customer refunds, etc.

Incoming Call (Unknown Number) -

For any incoming call, the system pops up a notification. If the caller does not exist in the system, it shows "Unknown" along with the phone number. When the user clicks on it, the system opens the Leads page.

Leads Page -

In the case of an unknown incoming call, if the user clicks on the notification popup, the system opens the Leads page.

Missed Call -

If the user is unable to receive an incoming call, the system shows it as a missed call. Here the user has two options: discard it by clicking the "Cancel" button, or click the "Ok" button to see its details.


Odoo 9 – Import Vendor Bills from Post-Finance

Nowadays everyone emphasises digital transactions over keeping hard copies of documents. PostFinance is a financial institution which allows its customers to upload and download e-invoices, offering a safe way to send and receive them.

We have come up with a module to import all vendor bills into Odoo from the PostFinance portal.

This module works with all bank account types, i.e. BVR, IBAN, normal bank accounts, etc. It pulls all invoice PDFs as attachments into the vendor bills created in Odoo.
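As a rough sketch of the attachment step, the values for Odoo's ir.attachment record could be built as below. The model and field names follow the Odoo 9 ORM; the helper name and data shapes are illustrative, not the module's actual API:

```python
import base64

def build_bill_attachment_vals(bill_id, filename, pdf_bytes):
    """Build ir.attachment values that link a downloaded invoice PDF
    to a vendor bill (account.invoice in Odoo 9)."""
    return {
        'name': filename,
        'datas': base64.b64encode(pdf_bytes).decode('ascii'),  # Odoo stores file data base64-encoded
        'datas_fname': filename,
        'res_model': 'account.invoice',   # attach to the vendor bill model
        'res_id': bill_id,                # id of the bill just created
        'mimetype': 'application/pdf',
    }
```

The module would pass such a dictionary to the ir.attachment create method after downloading each PDF from the portal.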

Build IoT Applications in the Cloud on the Fast Track with AWS IoT, Using Seeeduino Cloud and the Grove IoT Kit: Overview

AWS IoT is a managed cloud platform that enables you to connect IoT devices to AWS services and other devices, and provides secure data access and interaction to process and act upon device data in both offline and online states.

AWS IoT can connect billions of devices and handle trillions of messages, processing and routing them to AWS endpoints and to other devices reliably and securely. With AWS IoT, your applications can communicate with all your devices, all the time. AWS IoT makes it easy to use AWS services such as DynamoDB, RDS, Lambda, Kinesis, S3, and Machine Learning to build IoT applications that gather, process, analyze and act on data generated by connected devices, completely in the cloud.

Components of AWS IoT

AWS IoT Device SDK

AWS IoT Device Software Development Kit enables your devices to connect, authenticate, and exchange messages with AWS IoT using the HTTP/MQTT protocols. The SDK supports C, JavaScript, and Arduino.

Device Gateway

All connected devices communicate with the Device Gateway over the HTTP or MQTT protocol. MQTT in particular offers:
  • Highly Fault Tolerant Protocol for intermittent connectivity
  • Light footprint
  • Low n/w bandwidth requirement
  • Exchange messages using a publication/subscription model
  • One-to-one and one-to-many communications (Broadcast)
  • Support over a billion devices without provisioning infrastructure
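Messages are routed by topic, and subscriptions may use the MQTT wildcards `+` (exactly one level) and `#` (all remaining levels). A small illustrative sketch of how a topic filter matches a concrete topic:

```python
def topic_matches(pattern, topic):
    """Return True if an MQTT topic matches a subscription pattern.
    '+' matches exactly one topic level, '#' matches everything after it."""
    p_levels = pattern.split('/')
    t_levels = topic.split('/')
    for i, p in enumerate(p_levels):
        if p == '#':              # multi-level wildcard: matches the rest
            return True
        if i >= len(t_levels):    # pattern is longer than the topic
            return False
        if p != '+' and p != t_levels[i]:
            return False
    return len(p_levels) == len(t_levels)
```

For example, a device publishing to `sensors/room1/temp` would be delivered to subscribers of `sensors/+/temp` and `sensors/#`, but not of `sensors/+`.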

Authentication and Authorization

  • Connectivity over TLS (Transport Layer Security, the successor to SSL); TLS certificates are easily created.
  • Supports the AWS method of authentication (called ‘SigV4’) as well as X.509 certificate based authentication.
  • Connections using HTTP can use either of these methods, while connections using MQTT use certificate based authentication.
  • Those device certificates can be provisioned, activated and associated with the relevant policies that are configured using AWS IAM.

Device Registry

  • Establishes an identity for devices and tracks metadata such as the devices’ attributes and capabilities.
  • Assigns a unique identity to each device that is consistently formatted regardless of the type of device or how it connects.
  • Does not expire as long as you access or update your registry entry at least once every 7 years

Device Shadows

  • Creates a persistent, virtual version, or “shadow,” of each device that includes the device’s latest state.
  • The Device Shadows persist the last reported state and desired future state of each device even when the device is offline.
  • Device Shadows make it easier to build applications that interact with your devices by providing always available REST APIs.
  • Device Shadows let you store the state of your devices for up to a year for free.
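The desired-versus-reported comparison behind a shadow's delta can be pictured as a plain dictionary diff (a toy model for illustration, not the shadow service's actual implementation):

```python
def shadow_delta(desired, reported):
    """Return the part of the desired state that differs from the
    reported state -- the 'delta' a Device Shadow publishes so the
    device knows what to change when it reconnects."""
    return {k: v for k, v in desired.items() if reported.get(k) != v}
```

If an application sets a desired state of led on while the offline device last reported led off, the delta containing only the led key is delivered when the device comes back online.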

Rules Engine

  • The Rules Engine makes it possible to build IoT applications that gather, process, analyze and act on data generated by connected devices at global scale without having to manage any infrastructure.
  • The Rules Engine evaluates inbound messages published into AWS IoT and transforms and delivers them to another device or a cloud service, based on business rules you define.
  • The Rules Engine can also route messages to AWS endpoints including AWS Lambda, Amazon Kinesis, Amazon S3, Amazon Machine Learning, and Amazon DynamoDB. External endpoints can be reached using AWS Lambda, Amazon Kinesis, and Amazon Simple Notification Service (SNS).
  • You can create your own rules within the management console or write rules using a SQL-like syntax. Rules can be authored to behave differently depending upon the content of the message.
  • Rule Engine provides dozens of available functions that can be used to transform your data, and it’s possible to create infinitely more via AWS Lambda.
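In spirit, a rule such as `SELECT device, temp FROM 'sensors/#' WHERE temp > 30` filters and projects each inbound JSON message before routing it. A toy model of that evaluate-and-transform step (the real Rules Engine parses the SQL-like syntax itself; the function below only mimics the behaviour):

```python
import json

def apply_rule(payload, where, select):
    """Mimic one rule evaluation: parse the JSON payload, drop it if the
    WHERE predicate fails, otherwise project the SELECTed fields."""
    msg = json.loads(payload)
    if not where(msg):
        return None                     # message is not routed anywhere
    return {field: msg.get(field) for field in select}
```

A 35-degree reading passes the predicate above and is forwarded with only the selected fields, while a 20-degree reading is dropped.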

Geo Fencing in Odoo

Using Google's Geo-Fencing API, a module named google_map_fencing was built in Odoo 9 which stores locations in warehouses by defining a region. It identifies whether a particular pin (a location pointed to on Google Maps) falls in the defined region or not. Just click on the map over a specific area and you will get a fence over that region, defined by a polygon on the Google map which can be updated simply by dragging the polygon points. The region can be saved per location according to the address, and can be viewed or changed as you want. Following are the screenshots related to Google map fencing:
  1. Consider a Google map of a particular region.
  2. Click around a region, for example Pune, to get a fence around it. This way you can define a region. The white dots indicate the polygon points; by dragging them you can expand the region.
  3. Save the changes to view the defined region on the map again.
  4. Once the fencing is done, dropping a pin at any location tells you whether that pin falls inside the fence or outside it.
  5. An option to reset is also provided to clear all the changes.
  6. Geo-fencing in Odoo can be used for storing warehouse or contact locations in the case of multiple warehouses.
  7. Easy to implement and understand.
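The in-or-out check in point 4 is the classic point-in-polygon problem. A ray-casting sketch of the idea (for illustration only; the module itself delegates this check to the Google Maps API):

```python
def point_in_fence(point, fence):
    """Ray casting: count how many polygon edges a horizontal ray from
    `point` crosses; an odd count means the point is inside the fence.
    `point` is (lat, lng); `fence` is a list of (lat, lng) vertices."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        if (y1 > y) != (y2 > y):                     # edge spans the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:                          # crossing lies to the right
                inside = not inside
    return inside
```

A pin dropped at (5, 5) inside a square fence with corners (0, 0) to (10, 10) tests as inside; (15, 5) tests as outside.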

Odoo ERP for Magento 1.x / 2.0

Magento is a very popular e-commerce platform and provides a turnkey solution for running any business online. Magento 2 brings better performance, a flexible architecture, and improved administration and reporting. But it still lacks the ERP capability required to run a business. Magento can be complemented by Odoo, a leading open source suite of business applications. There is a ready bridge between Magento and Odoo which allows data to be synced between the systems. Odoo has an integrated suite of applications for managing Sales, Purchase, Inventory, Accounting, CRM, HRM, Shipping, and many more applications required to run an e-commerce business.

Magento Dashboard

Odoo Dashboard

Odoo has a very robust warehouse and inventory management application integrated with Accounting. Additionally, if you are running both a physical store and an online store, inventories can be shared between them. Odoo also provides a POS application which can be used in physical stores. Since Odoo is open source, there are more than 4000 modules available to use.

Odoo 9 : Rolled-up cost for Manufacturing products

It has always been a challenge to know the exact cost of products of a manufacturing nature in Odoo.

Pragmatic Techsoft Pvt. Ltd. has come up with a new module, "pragmatic_rolledup_cost", to compute the same.

This module works on the BOM defined for the product. The cost of every product defined in a BOM is passed on to the main product. It considers both the quantity and the price of each product used in the BOM.

It works with multilevel BOM as well.

If a product has multiple variants then the cost price of the product shall be updated on individual variants.
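The roll-up can be sketched as a recursive walk over the BOM tree (data shapes and names here are illustrative, not the module's actual API):

```python
def rolled_up_cost(product, boms, costs):
    """Compute a product's cost from its (possibly multilevel) BOM.
    `boms` maps a product to a list of (component, quantity) pairs;
    `costs` holds the cost price of purchased (leaf) components."""
    if product not in boms:        # leaf component: use its cost price
        return costs[product]
    return sum(qty * rolled_up_cost(component, boms, costs)
               for component, qty in boms[product])
```

For a table built from one top (itself made of 2 wood panels at 5.0 each) and 4 legs at 3.0 each, the rolled-up cost is 1 * 10.0 + 4 * 3.0 = 22.0, showing how the multilevel BOM is handled by recursion.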


Hubspot Integration With Odoo

The HubSpot integration module acts as a connector between the HubSpot system and the Odoo system. Contacts and companies created in HubSpot get created in the Odoo system as well, and vice versa.

A. Contact Creation from Odoo to HubSpot:
  • Once you create a contact in Odoo, it is automatically synced to HubSpot.
  • The following contact fields are synced to HubSpot:
      1. Contact Name
      2. Job Position
      3. Company
  • Any change in the contact fields is also updated on the contact in HubSpot.
B. Contact Creation from HubSpot to Odoo:
  • When a contact is created in HubSpot, it gets synced to Odoo.
  • A scheduler is triggered once a day (configurable), after which the contact gets created.
  • Any change in the contact is updated on the corresponding contact in Odoo.
C. Assign a Company to a Contact in Odoo
  • Assigning a company to a contact from Odoo will add the corresponding contact to that company in HubSpot.
  • Similarly, when a company is assigned to a contact from HubSpot, it gets reflected on the contact in Odoo after the scheduler is triggered.
  • The following configurations are required to integrate with HubSpot:
The timestamps of the Modified date for Contact and Company are set to the current timestamp by default.
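The field mapping in point A can be pictured as building the property payload that HubSpot's contacts API expects from an Odoo partner. This is a hedged sketch: the Odoo field names follow res.partner, the HubSpot property names are standard contact properties, but the connector's real mapping may differ:

```python
def to_hubspot_contact(partner):
    """Map an Odoo partner dict to a HubSpot contact-properties payload.
    Splits the single Odoo 'name' into HubSpot's firstname/lastname."""
    first, _, last = partner['name'].partition(' ')
    return {'properties': [
        {'property': 'firstname', 'value': first},
        {'property': 'lastname', 'value': last},
        {'property': 'jobtitle', 'value': partner.get('function', '')},   # Job Position
        {'property': 'company', 'value': partner.get('parent_name', '')}, # Company
    ]}
```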

Advanced Printscreen for Odoo 8(Printing Report Analysis)

Printscreen has been, and remains, an attractive feature of Odoo which enables the user to take a printout of the tree view of the selected records.

We have a new module which enables users to take a printout of the selected records in the tree view. After installing the module, a new dropdown will be available in the header of each tree view. No configuration is required; just installing the module is enough.

The main features of this module are:
  • Ability to export view in both PDF as well as in Excel.
  • Ability to export analytic views in to PDF and Excel including group totals.
  • Hierarchical view of the groups.

The module can be installed as a normal Odoo module. The only Python dependency the module needs is python-xlwt, which is required for exporting to an xls file. On Ubuntu it can be installed using the command "sudo apt-get install python-xlwt". After installing the module, you can see two anchor buttons called "Export to PDF" and "Export to Excel" on the right side of the OpenERP web client, as shown.

These buttons will only appear if the active view is a tree view; they won't be visible in any other view type. When the "Export to PDF" button is clicked, the report is exported to PDF; with "Export to Excel", the report is exported to Excel. The headings and the totals are printed in bold. Sample screenshots of the Excel and PDF reports are shown below:
The module also allows printing analytic reports to CSV or Excel. The screenshot below shows an analytic view:
You can also print the Graph view in PDF format, which is useful for analysis.

Magento 2 Basic Theme Development

Magento 2 theme development and customization can be categorised into different levels depending on a developer's skills. The different levels of customisation are:
  • If a developer just wants to change the colours, images or other small things on his website, this can be achieved with knowledge of CSS alone. The developer can take Magento's default CSS and make the required changes in it.
  • If a developer wants to make changes beyond CSS, such as changes to the HTML generated through a PHTML file, he can achieve this with a little knowledge of PHP and HTML.
  • If a developer wants to change the website structure, for example placing one block somewhere else, adding a new block, or completely moving a block to a different page, this can be achieved if the developer also has knowledge of XML.
  • Finally, if the developer has all the knowledge mentioned in the three points above, he can design his own theme.

Prerequisites

  • Previous Magento coding experience
  • Some knowledge of Magento 2
  • Magento 2 fully installed and running smoothly, access to the frontend & admin.

Theme Development:

Similar to Magento 1, themes are stored inside the app/design/frontend directory. Inside this, we need to create a vendor directory with your Vendor_name, and inside the vendor directory we need to create a new directory with your Theme_name. Neither name should contain spaces. After this, the structure will be: app/design/frontend/<Vendor_name>/<Theme_name>.
Now your structure is in place, and you need to declare your theme so that Magento knows it exists and you can set it as the active theme in the admin. Create a theme.xml file within the theme folder, on the root. You can use the code inside the Blank or Luma theme folders, or the code below. Just insert the name of your theme in the <title> tags. You can also specify a parent theme for fallback purposes.
<theme xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:noNamespaceSchemaLocation="urn:magento:framework:Config/etc/theme.xsd">
    <title>INSERT THEME NAME</title>
    <parent>Magento/blank</parent>
</theme>
This is the minimum code you need, but you can also declare a theme image. This is a thumbnail image which shows in the admin on your theme page so you can see a preview of what your theme looks like. To add one of these, add the code below in-between the XML nodes, underneath the theme declaration.
<media>
    <preview_image>media/theme-screenshot.jpg</preview_image>
</media>
Change the name of the thumbnail image to that of your filename. Place the image in the following location:
If you don’t have this file in the correct location when you visit your theme page in the admin, you’ll get an error – so make sure your image is in the right place and named correctly.

Theme Registration File:

The last part of declaring your theme is to add a registration.php file to your theme's root.
Add the code below to your registration file:

<?php
/**
 * Copyright © 2015 Magento. All rights reserved.
 * See COPYING.txt for license details.
 */
\Magento\Framework\Component\ComponentRegistrar::register(
    \Magento\Framework\Component\ComponentRegistrar::THEME,
    'frontend/<Vendor_name>/<Theme_name>',
    __DIR__
);

Theme Basic Structure:

In a design there are many static files, such as JavaScript, CSS, images and fonts. They are stored in separate directories under the theme package's web directory. In Magento 2 there is no skin directory as in Magento 1, so all these static files are kept in the web folder inside the theme root directory. Here is the structure:
├── etc/view.xml
├── web/
│ ├── css/
│ │ ├── source/
│ ├── fonts/
│ ├── images/
│ ├── js/
The etc/view.xml file is where you can configure the Magento catalog image sizes and other things. Copy the etc/view.xml file from one of the default themes and edit it as necessary.

The last thing you can do before activating your theme is to add your logo and declare it. The image file can be added to the web/images folder which you created not long ago. This can be whatever file type you like; in this case I've used an SVG. To actually tell the theme to use your logo, create the Magento_Theme/layout folders and add the following code to a default.xml file. Edit it to match your requirements.
<page xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:noNamespaceSchemaLocation="urn:magento:framework:View/Layout/etc/page_configuration.xsd">
    <body>
        <referenceBlock name="logo">
            <arguments>
                <argument name="logo_file" xsi:type="string">images/logo.svg</argument>
                <argument name="logo_img_width" xsi:type="number">300</argument>
                <argument name="logo_img_height" xsi:type="number">300</argument>
            </arguments>
        </referenceBlock>
    </body>
</page>

Composer File:

Composer is a tool for dependency management in PHP. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you.

To distribute your theme as a package, add a composer.json file to the theme directory and register the package on a packaging server:

{
    "name": "vendor_name/theme_nameultimate",
    "description": "N/A",
    "require": {
        "php": "~5.5.0|~5.6.0|~7.0.0",
        "magento/theme-frontend-blank": "100.0.*",
        "magento/framework": "100.0.*"
    },
    "type": "magento2-theme",
    "version": "100.0.1",
    "license": [
        "OSL-3.0",
        "AFL-3.0"
    ],
    "autoload": {
        "files": [
            "registration.php"
        ]
    }
}

Activate your Theme

Now everything is in place for you to activate your theme. Browse to the admin of your Magento 2 store, and go to Content > Design > Themes. Make sure your theme appears in this list – if it doesn’t, it hasn’t been declared correctly.
When you can see your theme in the path above, browse to Stores > Configuration > Design. Select the right store scope and then change the theme to your newly created theme.

Odoo 8 – Email Reminders To Vendors

Pragmatic Techsoft Pvt. Ltd. has come up with a new module, "pragmatic_email_reminders", to send reminder emails to vendors. The following email notification categories are involved:
  • Delayed Quotation Submission Against RFQ
  • Delayed Incoming Shipments
  • Partial Incoming Shipments
All of the above are reminder emails to vendors.

This module has the following configuration available:

1. Configuration in Odoo is available in the Purchase section:

  • This allows the user to set an initial delay for sending email reminders to vendors for delayed quote submissions.
  • The initial delay is counted from the date of the order itself.
  • The user can set up the scheduler frequency with an interval number and unit.
  • The maximum number of reminders per vendor per RFQ can be set.
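Given those settings, the resulting schedule is easy to picture: the first email goes out after the initial delay, then one email per interval, up to the configured maximum. A sketch with illustrative names (the module itself drives this through Odoo's scheduler):

```python
from datetime import datetime, timedelta

def reminder_dates(order_date, initial_delay_days, interval_days, max_reminders):
    """Dates on which reminder emails would be sent: the first one
    `initial_delay_days` after the order date, then every
    `interval_days`, capped at `max_reminders` emails."""
    first = order_date + timedelta(days=initial_delay_days)
    return [first + timedelta(days=i * interval_days)
            for i in range(max_reminders)]
```

For an order placed on 2016-01-01 with a 3-day initial delay, a 7-day interval and a maximum of 2 reminders, emails would go out on 2016-01-04 and 2016-01-11.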

2. RFQ Reminder Configuration

  • This allows the user to set an initial delay for sending email reminders to vendors for delayed quote submissions, counted from the date of the order.

3. Delayed Incoming Shipments

  • This email reminder is used to notify the vendor if none of the products were received by the scheduled date in the PO. The initial delay for this is counted on top of the "Scheduled Date" mentioned on the shipment.
  • The user can update the new expected shipment date on the picking if conveyed by the vendor.

4. Partial Incoming Shipments

  • This email reminder is used to notify the vendor if the full quantities in a picking have not been received.

Magento 2 Module Development

Here we are going to learn how to develop a simple module in Magento 2 and what the prerequisites for developing a module are. We assume that you have successfully installed Magento 2 in your development environment.

After you have successfully installed Magento 2.0 in your development environment and it functions properly, there are two things we recommend you do:

Disable the System Cache:
  • Log in to the Magento admin section.
  • Go to System > Cache Management.
  • Select all the cache types available there.
  • Select the Disable option from the dropdown in the top left corner of the table.
  • Click on the Submit button. This will disable all the caches in the Magento system.

Switch your Magento to Developer Mode:
  • Open your development environment terminal.
  • Move to the root location of your magento instance.
  • Run this command: php bin/magento deploy:mode:set developer.

All this information will help you to understand the new structure more easily. Now we will start to learn the module development step by step.

STEP 1: Create a module folder and necessary files to register the module.
In Magento 1.x, we learned that the module folder is created inside one of the code pools in app/code/ (community, core or local). In Magento 2 there are no more code pools. Now the module folder will be:

app/code/Pragmatic/Helloworld

The Pragmatic folder is the module's namespace, and Helloworld is the module's name.
Note: If you don't have the code folder in your app directory, create it manually.
After the module folder, we will create a module.xml file inside the app/code/Pragmatic/Helloworld/etc folder:

<?xml version="1.0"?>
<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:framework:Module/etc/module.xsd">
    <module name="Pragmatic_Helloworld" setup_version="1.0.0" />
</config>

And now we will create a registration.php file to register our module in Magento:

<?php
\Magento\Framework\Component\ComponentRegistrar::register(
    \Magento\Framework\Component\ComponentRegistrar::MODULE,
    'Pragmatic_Helloworld',
    __DIR__
);
Now, Open your terminal and go to the Magento 2 root. Run the following command from terminal:

php bin/magento setup:upgrade

Now, if you want to confirm whether your module is registered in Magento, log in to the Magento admin and go to Stores → Configuration → Advanced → Advanced. Here you can see the list of all enabled modules in Magento. One more place where you can check whether your module is registered is app/etc/config.php. Check the array for the 'Pragmatic_Helloworld' key, whose value should be set to 1.

STEP 2: Create the Router & Controller
First we will define the router by creating a routes.xml file inside the app/code/Pragmatic/Helloworld/etc/frontend folder with the following code:

<?xml version="1.0"?>
<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:framework:App/etc/routes.xsd">
    <router id="standard">
        <route id="hello" frontName="helloworld">
            <module name="Pragmatic_Helloworld" />
        </route>
    </router>
</config>

Here we're defining our frontend router and a route with the id "hello". The frontName attribute is going to be the first part of our URL.
In Magento 2, URLs are constructed this way:

<frontName>/<controller_folder_name>/<action_class_name>

So in our example, the final URL will look like this: helloworld/index/index

Create an Index.php controller file inside the app/code/Pragmatic/Helloworld/Controller/Index folder with the following code:

<?php
namespace Pragmatic\Helloworld\Controller\Index;
use Magento\Framework\App\Action\Context;
class Index extends \Magento\Framework\App\Action\Action
{
    protected $_resultPageFactory;
    public function __construct(Context $context, \Magento\Framework\View\Result\PageFactory $resultPageFactory)
    {
        $this->_resultPageFactory = $resultPageFactory;
        parent::__construct($context);
    }
    public function execute()
    {
        return $this->_resultPageFactory->create();
    }
}

In Magento 1 each controller can have multiple actions, but in Magento 2 this is not the case: every action has its own class which implements the execute() method.

STEP 3: Create a Block
Here, we will create a simple Helloworld.php block file inside the
app/code/Pragmatic/Helloworld/Block folder with the following code:

<?php
namespace Pragmatic\Helloworld\Block;
class Helloworld extends \Magento\Framework\View\Element\Template
{
    public function getMessage() {
        return 'Hello World!';
    }
}

In this block file, we have created a getMessage() method which will return the message 'Hello World!'.

STEP 4: Create the Layout and Template files
We have seen that in Magento 1.x layout files and template files are placed in a separate app/design/ folder, but in Magento 2 they are placed inside a new view folder in the module folder itself. Inside this we can have three folders, namely: adminhtml, base, and frontend.

The adminhtml folder is used for admin files, the frontend folder for frontend files, and the base folder for both admin & frontend files.

Here we will first create a helloworld_index_index.xml layout file inside the app/code/Pragmatic/Helloworld/view/frontend/layout folder with the following code:

<page xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:noNamespaceSchemaLocation="../../../../../../../lib/internal/Magento/Framework/View/Layout/etc/page_configuration.xsd" layout="1column">
    <body>
        <referenceContainer name="content">
            <block class="Pragmatic\Helloworld\Block\Helloworld" name="helloworld" template="helloworld.phtml" />
        </referenceContainer>
    </body>
</page>

In our layout file, we have created a block inside the content container and set the template to the helloworld.phtml file. Now we will create that template file inside the
app/code/Pragmatic/Helloworld/view/frontend/templates folder with the following code:

<h1><?php echo $this->getMessage(); ?></h1>

The $this variable references our block class, and we are calling its getMessage() method, which returns the string 'Hello World!'.

Now open your browser and hit yourdomain.com/helloworld/index/index. You will see the message as below:


Magento 2 Improvements over Magento 1

Magento is one of the top leading open-source digital commerce platforms for online stores. Magento is used by both small-scale and large-scale businesses to open and run their online stores. Magento is an excellent content management system (CMS) for developing an ecommerce website and managing a store. Magento keeps improving day by day thanks to the hard work and high-level analysis of the Magento team and the thousands of developers who take an interest in it, developing ever more functional and innovative extensions and themes that make Magento easier to use.

Now Magento has released a completely new version, named Magento 2. Magento 2 is totally different from Magento 1 in terms of new features, structure, flow, coding, views and much more. In Magento 2, the team has tried to address all the complaints of developers who faced issues in previous Magento versions. The key areas improved in Magento 2 are:

Improved Performance:

Performance was the major issue a developer faced when it came time to deploy a heavy ecommerce website using Magento, because the performance of Magento (up to 1.9.x) was slow. In Magento 2 the team has worked on this a lot and improved performance by an average of 20%. Additions in Magento 2:
  • Magento 2 fully supports PHP 5.6 and PHP 7. PHP 7 brings major performance improvements and much faster execution.
  • Nginx has been developed with three core principles in mind: high performance, high concurrency, and low memory usage. It significantly increases speed, and Magento 2 fully supports it.
  • Redis is an advanced key-value cache which provides top-notch performance and offers other features. It supports several use cases for in-memory datasets, which is the reason for its high-performance results.
  • Full page caching is so powerful because it stores the full page output in a cache, so subsequent page loads do not require much server load. Full page caching is essential for high-traffic websites, and Magento 2 fully supports it.

User-friendly Checkout:

Checkout is one of the most important parts of an ecommerce website. If the checkout process is not good, the store has a higher chance of products ending up in abandoned carts, which reduces the store's sales. Instead of the accordion layout used in older versions of Magento, the steps are now located across the top. The overall design is much clearer, more concise, and easier to navigate. Customers need a stress-free and efficient checkout experience, and Magento 2 provides just that.
  • Reduced number of checkout steps, and fewer details for customers to fill in.
  • Integration with popular payment gateways (PayPal, Braintree, Authorize.net, WorldPay (Enterprise Edition), CyberSource (Enterprise Edition)) that weren't previously supported. A variety of payment options provides a better checkout UX for your customers.
  • Automatic guest checkout.
  • Order summary with thumbnail.
  • Receiving shipping rates once information is provided.

User-friendly Administration Panel:

The main administration panel has been redesigned and is now displayed vertically on the left-hand side of the admin. The menu icons are very prominent because the entire navigation is touch-friendly and responsive. This improvement should be welcomed, and should definitely help those merchants who spend their days running around with an iPad in hand. Changes in the administration panel are:

  • UI enhancements providing responsive and touch-friendly navigation. Let the good times roll for merchants who work with the CMS via iPads or tablets.
  • Features are well categorised and managed under the menu, in comparison to previous Magento versions.
  • The grid view is now configurable, so the admin can select the attributes to be shown in it. Previously, we needed to hard-code any new column in the grid.

  • Addition of new menu element “Marketing”. All the marketing related features are listed inside this menu.

Easy Product Creation and Configuration:

In previous Magento versions, product creation was a long process, especially for configurable products. Now product creation is easier and more customizable from one place. Enhancements are:
  • The product creation process in Magento 2 has been streamlined, and it is easier to create any type of product.
  • The product attribute set handling has been modified. We can choose an attribute set while creating a product and change it from the same place.
  • Now you can add videos as product images and in the description.

  • We can show the configurable options with their values, e.g. we can show the actual colour in place of a dropdown.

Quarterly Platform Updates:

Magento 2 will release new features for both Community and Enterprise Editions every three months.


Magento 2 maintains the flexibility for which Magento is known in ecommerce development. Businesses will move from Magento 1 to Magento 2, and from other ecommerce platforms to Magento 2, because of the improvements mentioned above and others. Experts believe that businesses will grow with the additional and improved functionality of Magento 2, such as better performance and a small, streamlined checkout process that makes customers come back to your store.

Magento 1 will be officially supported for three years from the launch of Magento 2 (December 2015), so you can take your time migrating your Magento store to Magento 2. If you have a mid-sized store, don't hurry to update. Just wait while the usual extensions for payment gateways and delivery options are tested on larger-scale retailers.

Although Magento 2 offers a better UI experience for customers and admins and has many advantages, it will be more expensive to set up and requires more expertise.

If you need any kind of assistance or development for Magento 2, we have a team of Magento developers. Contact us with your requirements.

InstaPics - Instagram Magento Extension

InstaPics is a Magento extension to showcase Instagram images in your online store. It fetches Instagram images into your store by #hashtag or by user details.

Social commerce has been one of the biggest commerce trends of the last few years. Social media is now a big part of everyone's day-to-day life, and Instagram is one of the most popular networks across all age groups, especially among the young. You can increase the market value of your products by showcasing how your customers use them. This will increase customers' interest in your products, and in some ways it will also help them learn more about ways to use the products.

Instagram images can be displayed anywhere throughout the website in a completely responsive layout. Images can be shown in either of two fashions:
  • Grid Layout: configure the number of images to show in the grid, the size of the grid, how the images are fetched (by #hashtag or by user details), etc.
  • Slider Layout: configure the complete slider: number of images, blocks to show in one slide, pagination, movement, margin, how the images are fetched (by #hashtag or by user details), etc.

You can insert your Instagram images into any block, section, or page of your website. The display is fully responsive and moulds itself to the size of its container.


Pragmatic has provided a connection with the social feeds of Facebook and Twitter In Odoo

With the Odoo 9 Social Feed module, Pragmatic has provided a connection to the social feeds of Facebook and Twitter. The feeds from both sites are combined and presented to the user in chronological order, and custom Odoo posts can be added alongside them.
  • Before starting with this module, the user needs to specify, in the configuration, the account names from which feeds are to be fetched.
  • Feeds from multiple accounts can also be fetched.
  • The feeds for the specified Facebook and Twitter accounts are fetched, arranged in ascending order of time elapsed, and displayed to the user.
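Conceptually, the merge step can be sketched in a few lines of Python. The post dicts below are hypothetical shapes, not the module's actual data model; the real module maps the Facebook and Twitter API responses into something similar before sorting:

```python
from datetime import datetime, timezone

def merge_feeds(facebook_posts, twitter_posts, custom_posts=()):
    """Combine posts from all sources and sort them newest-first.

    Each post is assumed to be a dict with a 'created_at' datetime,
    a 'source' tag, and an 'account' name (assumed shapes).
    """
    combined = list(facebook_posts) + list(twitter_posts) + list(custom_posts)
    # Newest posts first, i.e. ascending order of time elapsed.
    return sorted(combined, key=lambda p: p["created_at"], reverse=True)

fb = [{"source": "facebook", "account": "acme",
      "created_at": datetime(2016, 5, 1, 12, 0, tzinfo=timezone.utc)}]
tw = [{"source": "twitter", "account": "acme",
      "created_at": datetime(2016, 5, 1, 13, 0, tzinfo=timezone.utc)}]
feed = merge_feeds(fb, tw)
print([p["source"] for p in feed])  # twitter first (more recent)
```

Custom Odoo posts simply join the same sorted stream, which is why they interleave naturally with the social feeds.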
Facebook Feeds
  • To identify that a particular post is from Facebook, a Facebook icon is shown on top of that post.
  • The name of the account from which the post was fetched is shown beside the icon, which is helpful when multiple accounts are specified.
Twitter Feeds
  • Similarly, to identify that a particular post is from Twitter, a Twitter icon is shown on top of that post.
  • The name of the account from which the post was fetched is shown beside the icon, which is helpful when multiple accounts are specified.
Custom Feeds
  • There is also a facility to send custom feeds (posted by Odoo users).
  • Users can post company-related feeds, best wishes, or any other news internally, which will be visible to other Odoo users.
  • Internal posts are written in the textbox at the bottom of the page.
  • You can also attach an image that illustrates your message.
  • The image icon beside the textbox lets you select images from your local machine; the uploaded image is added to the textbox.
  • When the message is complete, click the Post button; the message is posted above, along with a "time ago" tag.
Source page of the feed
  • The complete Facebook or Twitter feed is not displayed here, as it could occupy a lot of space. Hence, a 'read more..' option is provided.
  • The 'read more..' option appears at the end of every feed for viewing it in detail.
  • Clicking the link redirects the user to the source page of that feed.

Odoo 10 Community Edition on AWS Marketplace

Pragmatic has launched Odoo 10 on the Amazon Cloud Marketplace. With the click of a button you can launch an instance of Odoo 10 Community Edition.

Odoo 10 Community Edition Features


  • Event Barcode
  • Email scheduling: easy follow-up



HR :

  • HR Attendance (Kiosk Mode)
  • Print Employee Badge
  • Sign in/out authentication using a PIN
  • Timesheet app: usability improvements


Point of Sale :

  • POS serial numbers
  • POS default cash control
  • POS Restaurant: transfer an order from one table to another


Sales / Inventory :

  • Option in the payment gateway to auto-confirm the SO
  • Optional out-of-stock warning
  • Delivery: choose the package when clicking on "Put in Pack"
  • Inventory: quality control can be applied on picking
  • Serial number upstream traceability
  • Delivery order: add a margin in % to cover losses
  • Picking: up/down traceability

Website/ Ecommerce:

  • B2B/B2C
  • New checkout design for address selection
  • Ecommerce : Add multiple images of products
  • Improved Portal Frontend view
  • Easy to set website Favicon
  • Ecommerce insight: Save, manage, and reuse credit cards. Authorize amount at checkout and capture at shipping.
  • Ecommerce user can pay through stored card
  • Easy to trace Website Orders, Invoices

Expense :

  • The accountant can pay the expense directly
  • Email alias to record expenses directly (based on the expense's internal reference, the system identifies the product and creates the expense accordingly)
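As a rough illustration of how an email-alias expense gateway could map a subject line to a product, here is a sketch. The subject format (`internal reference, amount, note`), the product table, and the function names are assumptions for illustration, not Odoo's actual implementation:

```python
import re

# Hypothetical mapping from internal references to expense products.
PRODUCTS = {"EXP-TAXI": "Taxi fares", "EXP-MEAL": "Meals"}

def parse_expense_subject(subject):
    """Parse a subject like 'EXP-TAXI 45.50 Airport ride' into an expense."""
    match = re.match(r"(?P<ref>[A-Z\-]+)\s+(?P<amount>\d+(?:\.\d+)?)\s*(?P<note>.*)",
                     subject)
    if not match or match["ref"] not in PRODUCTS:
        return None  # unknown reference: leave for manual entry
    return {
        "product": PRODUCTS[match["ref"]],
        "amount": float(match["amount"]),
        "note": match["note"],
    }

print(parse_expense_subject("EXP-TAXI 45.50 Airport ride"))
```

Subjects that do not carry a known reference fall through to `None`, so a human can still record them by hand.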

General / Discuss :

  • Chatter history is clickable (links to the source document)
  • Debug mode no longer splits the web assets by default
  • Keyboard shortcuts detailed in the top-right menu on the home page
  • Easier to maintain user access (set default access rights on the default user)
  • Search date ranges quickly with the new in-between operator
  • Canned responses and /commands in Discuss
  • Create records in one click
  • Company settings for apps moved to Apps > Settings
  • Any HTML-type report can be edited easily in the app view => https://drive.google.com/file/d/0B21cUNlAdZ6gWHl5NUE4b0lqc0k/view?usp=drivesdk


  • Easy to create new apps
  • Easy to add a new field to either the form view or the tree view
  • Change strings, help messages, views, reports, ...

Purchase :

  • Editable PO: easy to edit a confirmed PO
  • Purchase Tender: blanket order type


Project :

  • Project dashboard is now based on the user's favorites
  • Easy to maintain sub-tasks
  • Forecast grid: by user, by project


Subscription :

  • Subscription dashboard by company, tag, contract
  • New cohort analysis

Helpdesk Management:

Easy to assign tickets via different assignment methods:
  • Randomly
  • Manually
  • Balanced
Generate Tickets
  • Email Alias
  • Live Chat
  • Ticket Form
  • External API
  • SLA Policies
  • Rating
  • Canned Responses
  • Forum: Help Center
  • Slides: eLearning


Manufacturing :

  • PLM
  • MPS
  • Maintenance
  • Quality
  • Easy to know "Overall Equipment Effectiveness"
  • Unbuild orders, scrap products

Odoo 10 Helpdesk Management

Helpdesk is a new module introduced with the release of Odoo 10. It helps a company run its helpdesk with the features listed below.

Create ticket using multiple channels
  • Users can create tickets manually, configure incoming email to create tickets, or use a website form. Third-party applications can also be connected via the API, so tickets can be created through web services.

Ticket status can be tracked through the New, In Progress, Solved, and Cancelled stages.

Priorities such as low, high, and urgent can be assigned to tickets.
Assign ticket to user
  • A helpdesk team is configured to handle the generated tickets. Users can define helpdesk team members and assign tickets to them manually, randomly, or by the balanced method.
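The three assignment methods can be sketched as follows. The method names mirror the Helpdesk options; the selection logic itself is an assumption for illustration (in particular, "balanced" is interpreted here as "fewest open tickets wins"):

```python
from collections import Counter
import random

def assign_ticket(ticket, team, open_counts, method="balanced"):
    """Pick a team member for a ticket (illustrative sketch only)."""
    if method == "manually":
        return ticket.get("assignee")  # whatever the manager chose
    if method == "randomly":
        return random.choice(team)
    # "balanced": hand the ticket to whoever has the fewest open tickets.
    return min(team, key=lambda member: open_counts.get(member, 0))

team = ["ana", "bob", "carol"]
open_counts = Counter({"ana": 4, "bob": 1, "carol": 2})
print(assign_ticket({"subject": "Login fails"}, team, open_counts))  # bob
```

The balanced method keeps workloads even over time, since each new ticket always lands on the least-loaded member.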

SLAs (Service Level Agreements)
  • Configure service level agreements and automate the related checks and actions through SLA policies, such as the time allowed for urgent tickets.
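An SLA check of this kind reduces to deadline arithmetic. The policy table below (hours allowed per priority) is an assumed example, not Odoo's shipped configuration:

```python
from datetime import datetime, timedelta

# Assumed SLA policy: hours allowed to close a ticket per priority.
SLA_HOURS = {"urgent": 4, "high": 24, "low": 72}

def sla_deadline(created_at, priority):
    return created_at + timedelta(hours=SLA_HOURS[priority])

def sla_breached(created_at, priority, now):
    """True if the ticket is past its SLA deadline."""
    return now > sla_deadline(created_at, priority)

created = datetime(2016, 11, 2, 9, 0)
print(sla_breached(created, "urgent", datetime(2016, 11, 2, 14, 0)))  # True: 5h > 4h
```

A scheduled job evaluating this predicate is enough to trigger escalations or reminders automatically.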

The dashboard shows various helpdesk statistics, such as average open hours, SLA compliance, ticket priority, performance (actual vs. targeted), and success rate.

Odoo 10 Manufacturing MRP Enterprise Edition v/s Odoo 10 Manufacturing MRP Community Edition Features Comparison

Features / Apps                          Odoo 10 Community    Odoo 10 Enterprise
Manufacturing Orders                     ✓                    ✓
Work Orders                              ✓                    ✓
Plant Floor Dashboard                    ×                    ✓
Work Center Planning                     ×                    ✓
Master Production Schedule               ×                    ✓
Preventive and Corrective Maintenance    ×                    ✓
Barcode Interface                        ×                    ✓
Tablet and Mobile Support                ×                    ✓
KPIs, Statistics and Dashboards          ×                    ✓

Odoo on AWS Cloud

What is Odoo?

Odoo is an open source business application suite: modern software for smart businesses. Boost your sales, step up productivity, and manage all day-to-day activities. Fully integrated, simple, and mobile, Odoo is flexible and evolving.


  • A business suite of applications: ERP, CRM, HRM, and more.
  • Open source.
  • Low cost.
  • Provides standard applications.
  • Modular.
  • Easy to customise and extend features.
  • Has grown fast and keeps growing.
  • Web based.
  • Fully integrated.

Odoo Features

Odoo is now the All-in-one business software

Common Challenges faced while Deploying and Hosting Odoo

Odoo Deploying On Cloud

Pragmatic Odoo provides a one-click install solution for Odoo, so you can run your own Odoo server in the cloud. Odoo isn't just one application; it's hundreds. Odoo is an enterprise resource platform from which you can manage all your business operations, from supply chain and project management to accounting and HR. Out of the box, Odoo includes messaging, sales CRM, and reporting modules. Click the settings tab and you'll be presented with nearly 2,000 other modules (bug tracking, project management, timesheets, MRP, recruiting, calendar, warehouse management, and much more) that can be deployed with one click.

iPaaS: a Scalable, Reliable, High-Performance Platform

Pragmatic has powered Odoo on Amazon Web Services (AWS), which provides a highly reliable, scalable, low-cost infrastructure platform in the fast-growing cloud spectrum that fuels millions of businesses across the globe.
Deploy your enterprise apps using Odoo on AWS and enjoy low-cost, pay-as-you-go pricing and elastic capacity on a global cloud infrastructure with data centers around the globe.

Features of Odoo iPaaS

We offer 4 plans:

  • Apricot: Single Server Odoo 9 @ AWS Cloud Platform. "For a start: the best startup." Best suited to startups and developers.
  • Orange: Multi Server Odoo 9 @ AWS Cloud Platform. "For daily use with high benefits." Best suited to small businesses.
  • Apple: Multi Server Odoo 9 @ AWS Cloud Platform. "The best of both worlds." Best suited to mid-sized businesses.
  • Mango: Multi Server Odoo 9 @ AWS Cloud Platform. "The king of indulgence: live life king-size." Best suited to large enterprises.

Every plan runs on a high-performance operation heap. Depending on the plan, the stack includes:

  • 1 or 2 Odoo 9 Community Edition servers (a combined Odoo and Postgres server on the single-server Apricot plan)
  • A dedicated Postgres database server
  • A load balancer
  • An application firewall
  • Auto scaling (web server)
  • Redis cache
  • PHPPgAdmin
  • Odoo custom modules
  • Database & system images
  • Highly secure & reliable infrastructure


HIPAA Compliance with AWS

AWS HIPAA Compliance

Customers can use Amazon Web Services (AWS) to create applications compliant with HIPAA (the Health Insurance Portability and Accountability Act) and with the HIPAA Privacy and Security Rules for protecting Protected Health Information (PHI).

HIPAA and HITECH impose requirements related to the use and disclosure of PHI, appropriate safeguards to protect PHI, individual rights, and administrative responsibilities.

Covered entities and their business associates can use the secure, scalable, low-cost IT provided by Amazon Web Services (AWS) to architect applications in alignment with HIPAA and HITECH compliance requirements. AWS services and data centers have multiple layers of operational and physical security to help ensure the integrity and safety of customer data. AWS offers a standardized Business Associate Addendum (BAA) for such customers. Customers may use any AWS service in an account designated as a HIPAA account, but they may only process, store, and transmit PHI using the HIPAA-eligible services defined in the AWS BAA.

Amazon Web Services which are HIPAA Compliant

  • Amazon DynamoDB
  • Amazon Elastic Block Store (Amazon EBS)
  • Amazon Elastic Compute Cloud (Amazon EC2)
  • Elastic Load Balancing
  • Amazon Elastic MapReduce (Amazon EMR)
  • Amazon Glacier
  • Amazon Redshift
  • Amazon Relational Database Service (Amazon RDS) for MySQL
  • Amazon RDS for Oracle
  • Amazon Simple Storage Service (Amazon S3)

HIPAA architectures on AWS

AWS provides multiple services to deploy a highly available, scalable, secure application stack, which can serve a limitless variety of healthcare applications and use cases. In this blog, we will embark on a journey into HIPAA-eligible architectures by scoping the discussion to the following deployment diagram, which can be adopted as a starting point for building a HIPAA-eligible, web-facing application.

The underlying theme to this architecture is encryption everywhere.


1) Obtain a Business Associate Agreement with AWS

Once you have determined that storing, processing, or transmitting protected health information (PHI) is absolutely necessary, before moving any of this data to AWS infrastructure you must contact AWS and make sure you have all the necessary contracts and a Business Associate Agreement (BAA) in place. These contracts will serve to clarify and limit, as appropriate, the permissible uses and disclosures of protected health information.

2) Authentication and Authorization

The authentication and authorization mechanisms you define for your HIPAA-eligible system must be documented as part of a System Security Plan (SSP) with all roles and responsibilities documented in detail along with a configuration control process that specifies initiation, approval, change, and acceptance processes for all change requests. Although the details of defining these processes won’t be discussed here, the AWS Identity and Access Management (AWS IAM) service does offer the granular policies required for achieving the necessary controls under HIPAA and HITECH.
As a baseline, enable multi-factor authentication (MFA) on your AWS root account, lock away its access keys, and perform day-to-day work from an IAM account that is granted only the privileges it needs in your AWS account.

3) Web and Application Layers

DNS resolution is relatively straightforward and can be achieved using Amazon Route 53. Just be sure not to use any PHI in the URLs.
Amazon Elastic Load Balancer Configuration
The primary entity that receives the request from Amazon Route 53 is an Internet-facing Elastic Load Balancer. There are multiple ways in which an ELB load balancer can be configured, as explained here. To protect the confidential PHI data, you must enable secure communication options only, like HTTPS-based or TCP/SSL-based end-to-end communication. Although you can use TCP/SSL pass-through mode on the ELB load balancer for your web tier requests, using this option limits the use of some of the HTTP/HTTPS specific features like sticky sessions and X-Forward-For headers. For this reason, many startups prefer to make use of HTTPS-based communication on ELB, as shown in the following screenshot.
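The "secure communication options only" rule can be expressed as a small check over the listener configuration. The dict shape below loosely mirrors the ELB API's listener description, but is an assumption here, not the exact API response:

```python
# Front-end (client -> ELB) and back-end (ELB -> instance) protocols
# must both be encrypted for end-to-end protection of PHI.
SECURE_PROTOCOLS = {"HTTPS", "SSL"}

def listeners_are_secure(listeners):
    """True only if every listener uses an encrypted protocol on both legs."""
    return all(
        l["Protocol"] in SECURE_PROTOCOLS
        and l["InstanceProtocol"] in SECURE_PROTOCOLS
        for l in listeners
    )

listeners = [{"Protocol": "HTTPS", "LoadBalancerPort": 443,
              "InstanceProtocol": "HTTPS", "InstancePort": 443}]
print(listeners_are_secure(listeners))  # True
```

A check like this can run in a compliance audit script so that an HTTP listener added by mistake is flagged immediately.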

As shown in the configuration, there’s a single listener configured that accepts HTTPS requests on port 443 and sends requests to back-end instances using HTTPS on port 443. Because HTTPS is used for the front-end connection, you must create the certificate as per your publicly accessible domain name, get the certificate signed by a CA (for an internal load balancer you can use a self-signed certificate as well), and then upload the certificate using AWS IAM, which manages your SSL certificates, as explained in the ELB documentation. This certificate is then utilized to decrypt the HTTPS-based encrypted requests that are received by the ELB load balancer.

To route the requests from the ELB load balancer to the back-end instances, you must use back-end server authentication so that the communication is encrypted throughout. You can enable this by creating a public key policy that uses a public key for authentication. You use this public key policy to create a back-end server authentication policy. Finally, you enable the back-end server authentication by setting the back-end server authentication policy with the back-end server port, which in this case would be 443 for an HTTPS protocol. For an example of how to set this up easily using OpenSSL, check out the ELB documentation and Apache Tomcat’s documentation on certificates.

WAF/IDS/IPS Layer
Many of our customers make use of an extra layer of security (such as web application firewalls and intrusion detection/prevention solutions) in front of their web layer to avoid potential malicious attacks on their sensitive applications. Multiple options are available in the AWS Marketplace to provision tools like WAF/IDS/IPS, so you can start there instead of setting them up from scratch on an EC2 instance.

Web Layer
The next layer is the web tier, which can be auto-scaled for high availability and placed behind an internal ELB load balancer with only an HTTPS listener configured. To further secure access to the web servers, you should open up the web server instances' security group to accept requests only from the designated load balancer, as shown in the following diagram.

App Layer
Encryption of traffic between the web layer and app layer will look similar to the setup in the preceding diagram. Again, there will be an internal ELB load balancer with HTTPS listener configured. On the application servers, SSL certificates are set up to keep the communication channel encrypted end-to-end.
Both the app and web layers should also be in private subnets with auto-scaling enabled to ensure a highly responsive and stable healthcare application.

4) Database Layer
The easiest way to get started with database encryption is to make use of Amazon RDS (MySQL or Oracle engine). To protect your sensitive PHI data, you should consider the following best practices for Amazon RDS:

  • You should have access to the database enabled only from the application tier (using appropriate security group/NACL rules).
  • Any data that has the potential to contain PHI should always be encrypted by enabling the encryption option for your Amazon RDS DB instance, as shown in the following screenshot. Data that is encrypted at rest includes the underlying storage for a DB instance, its automated backups, read replicas, and snapshots.

  • For encryption of data in-transit, MySQL provides a mechanism to communicate with the DB instance over an SSL channel, as described here. Likewise, for Oracle RDS you can configure Oracle Native Network Encryption to encrypt the data as it moves to and from a DB instance.
  • For encryption of data at rest, you could also make use of Oracle’s Transparent Data Encryption (TDE) by setting the appropriate parameter in the Options Group associated with the RDS instance. With this, you can enable both TDE tablespace encryption (encrypts entire application tables) and TDE column encryption (encrypts individual data elements that contain sensitive data) to protect your PHI data. You could also store the Amazon RDS Oracle TDE Keys by leveraging AWS CloudHSM, a service that provides dedicated Hardware Security Module (HSM) appliances within the AWS cloud. More details on this integration are available here.
    For additional discussion on Amazon RDS encryption mechanisms, please refer back to the whitepaper.

5) Backup/Restore
To protect your patient data, you should be vigilant about your backup and restore processes. Most AWS services have mechanisms in place to perform backup so that you can revert to a last known stable state if any changes need to be backed out. For example, features like EC2 AMI creation or snapshotting (as in the Amazon EBS, Amazon RDS, and Amazon Redshift services) should be able to meet the majority of backup requirements.
You can also make use of third-party backup tools, which integrate with Amazon S3 and Amazon Glacier to manage secure, scalable, and durable copies of your data. When using Amazon S3, you have multiple ways to encrypt your data at rest and can leverage both client-side encryption and server-side encryption mechanisms. Details on these options are available in the Amazon S3 documentation. PHI in S3 buckets should always be encrypted. You can also enforce the server-side encryption (SSE) option on any of the buckets by adding the following condition to your Amazon S3 bucket policy:

"Condition": {
    "StringNotEquals": {
        "s3:x-amz-server-side-encryption": "AES256"
    }
}

(Attach this condition to a Deny statement on s3:PutObject so that uploads without server-side encryption are rejected.) For security of data in transit, you should always use Secure Sockets Layer (SSL)-enabled endpoints for all services, including Amazon S3 for backups. If you are backing up data from EC2 instances in a VPC to Amazon S3, you can also use VPC endpoints for Amazon S3. This feature creates a private connection between your VPC and Amazon S3 without requiring access over the Internet or a NAT/proxy device.

6) EC2 and EBS requirements

Amazon EC2 is a scalable, user-configurable compute service that supports multiple methods for encrypting data at rest, ranging from application-level or field-level encryption of PHI as it is processed, to transparent data-encryption features of commercial databases, to the use of third-party tools. For a more complete discussion of the options, see the whitepaper.
In the next example, we show you a simple approach to architecting HIPAA-eligible web servers.
First, you must be sure that your EC2 instance is running on hardware that is dedicated to a single customer by using a dedicated instance. You can do this by setting the tenancy attribute to “dedicated” on either the Amazon VPC that the instance is launched in, the Auto-Scaling Launch Configuration, or on the instance itself, as shown in the following screenshot.

Because Amazon Elastic Block Store (Amazon EBS) storage encryption is consistent with HIPAA guidance at the time of this blog writing, the easiest way to fulfill the at-rest encryption requirement is to choose an EC2 instance type that supports Amazon EBS encryption, and then add the encrypted EBS volume to your instance. (See the EBS link for a list of instance types.)

You should keep all of your sensitive PHI data on the encrypted EBS volumes, and be sure never to place PHI on the unencrypted root volume.
You might want to take some additional precautions to ensure that the unencrypted volume does not get used for PHI. For example, you can consider a partner solution from the AWS Marketplace, which offers full-drive encryption to help you feel more at ease. This will help to ensure that if there ever is a program (such as a TCP core dump) that uses the root drive as temporary storage or scratch space without your knowledge, it will be encrypted. Other startups have developed their own techniques for securing the root volume by using Logical Volume Management (LVM) to repartition the volume into encrypted segments and to make other portions read-only.

7) Key Management

At every turn in this architecture, we have mentioned encryption. Ensuring end-to-end encryption of our PHI is an essential component of keeping our data secure. Encryption in flight protects you from eavesdroppers, and encryption at rest defends against hackers of the physical devices. However, at some point we do need to open this ciphertext PHI in order to use it in our application. This is where key management becomes a “key” piece of the implementation (pun intended).
AWS does not place limitations on how you choose to store or manage your keys. Essentially, there are four general approaches to key management on AWS:

  1. Do it yourself
  2. Partner solutions
  3. AWS CloudHSM
  4. AWS KMS

A full discussion (or even a good starting discussion) on key management far exceeds what we can provide in a single blog entry, so we will just provide some general advice about key management as it relates to HIPAA.
The first piece of advice is that you should strongly consider the built-in AWS option. All of the checkbox encryption methods — such as Amazon S3 server-side encryption, Amazon EBS encrypted volumes, Amazon Redshift encryption, and Amazon RDS encryption make it very easy to keep your PHI encrypted and you should explore these options to see if these tools meet your BAA requirements and HHS guidance. These methods automate or abstract many of the tasks necessary for good key maintenance such as multifactor encryption and regular key rotation. AWS handles the heavy lifting and ensures that your encryption methods are using one of the strongest block ciphers available.
If you need to create a separation of duties between staff that maintain the keys vs. developers who work with the keys, or if you would simply like additional control of your keys and want to be able to easily create, control, rotate and use your encryption keys then you should look at using the Amazon Key Management Service (KMS). This service is still integrated with AWS SDKs and other AWS services like AWS CloudTrail, which can help provide auditable logs to help meet your HIPAA compliance requirements.
If you need additional controls beyond what is provided by AWS, you should be sure that you have proper security experts who can ensure the safe management of your encryption keys. Remember, a lost key could render your entire dataset useless, and AWS Support will not have any way to help a problematic situation.
For more on encryption and key Management in AWS, check out this video from last year’s re:Invent, and read the Securing Data at Rest with Encryption whitepaper.

8) Logging and Monitoring

Logging and monitoring of system access will play a starring role in your HIPAA-eligible architecture. The goal is to put auditing in place that allows security analysts to examine detailed activity logs and reports: who had access, from which IP addresses, what data was accessed, and so on. The data should be tracked, logged, and stored in a central location for extended periods in case of an audit.
At the AWS account level, be sure to launch AWS CloudTrail and immediately start recording all AWS API calls. You should also launch AWS Config, which will provide you with an AWS resource inventory, configuration history, and configuration change notifications.
You will also need to monitor and maintain the logs of your AWS resources for keeping a record of system access to PHI as well as running analytics that could serve as part of your HIPAA Security Risk Assessment. One way to do this is with AWS CloudWatch, a monitoring service that you can use to collect server logs from your EC2 instances as well as logs from the Amazon RDS DB instance, Amazon EBS volumes, and the ELB elastic load balancer. You can even develop custom metrics to obtain the necessary log information from your own applications.
CloudWatch has other useful features:

  • View graphs and statistics on the console
  • Set up alarms to automatically notify you of abnormal system behavior
  • Capture network traffic in a single repository through the integration of CloudWatch with VPC Flow Logs

With all these logging mechanisms, you want to be sure that no PHI is actually stored in the logs. This usually requires some special attention. For example, sometimes you might need to encrypt PHI in your custom metric before sending to AWS CloudTrail. You also should be aware of everything that is coming into the logs. For example, the combination of session user and IP address coming from the ELB logs is considered PHI in some situations, so you should catch these special circumstances to be sure PHI is fully scrubbed from the logs.
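One common way to scrub PHI before logs leave the instance is a pattern-based filter. This is an illustrative sketch only: the patterns below (an SSN-like number and an email address) are examples, and what actually counts as PHI depends on your own risk assessment:

```python
import re

# Redaction patterns applied to every log line before shipping it to
# the central store. Extend this list per your risk assessment.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),
]

def scrub(line):
    """Return the log line with every matched pattern replaced."""
    for pattern, replacement in PATTERNS:
        line = pattern.sub(replacement, line)
    return line

print(scrub("login ok for jane.doe@example.com ssn=123-45-6789"))
```

Running such a filter in the log-shipping agent keeps identifiers out of CloudWatch and S3 while preserving the rest of the line for analysis.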
Finally, Amazon S3 is a fantastic repository for all these logs. However, take extra precautions to lock down the permissions for log access of these highly sensitive data sets. You might want to consider some more stringent access requirements such as requiring multi-factor authentication to read the logs, turning on versioning to retain any logs that get deleted, or even setting up cross-region replication to keep a second copy of the logs in an entirely different AWS account.

AWS Environment

We will not only architect your AWS environment to meet your HIPAA compliance needs, but also provide ongoing 24/7 managed services to help ensure that your AWS environment remains HIPAA compliant. Our HIPAA Compliance Support Plan for AWS includes a comprehensive suite of security and support features designed specifically to address the HIPAA and HITECH standards, including the necessary levels of encryption within AWS.

Encryption and Protection of PHI in AWS

To implement encryption, customers may evaluate and take advantage of the encryption features native to the HIPAA-eligible services, or they can satisfy the encryption requirements through other means consistent with the Guidance. The following sections provide high-level details about using the available encryption features in each of the HIPAA-eligible services, along with other patterns for encrypting PHI. A final section describes how AWS KMS can be used to encrypt the keys used for encryption of PHI on AWS.

Amazon EC2

Amazon EC2 is a scalable, user-configurable compute service that supports multiple methods for encrypting data at rest.
For example, customers might select to perform application- or field-level encryption of PHI as it is processed within an application or database platform hosted in an Amazon EC2 instance.
Approaches range from encrypting data using standard libraries in an application framework such as Java or .NET; leveraging Transparent Data Encryption features in Microsoft SQL or Oracle; or by integrating other third-party and software as a service (SaaS)-based solutions into their applications. Customers can choose to integrate their applications running in Amazon EC2 with AWS KMS SDKs, simplifying the process of key management and storage. Customers can also implement encryption of data at rest using file-level or full disk encryption (FDE) by utilizing third-party software from AWS Marketplace Partners or native file system encryption tools (such as dm-crypt, LUKS, etc.).

Network Control
Network traffic containing PHI must be encrypted in transit. For traffic between external sources (such as the Internet or a traditional IT environment) and Amazon EC2, customers should use industry-standard transport encryption mechanisms such as TLS or IPsec virtual private networks (VPNs), consistent with the Guidance. Within an Amazon Virtual Private Cloud (VPC), network traffic containing PHI that travels between Amazon EC2 instances must also be encrypted; most applications support TLS or other protocols providing in-transit encryption that can be configured consistently with the Guidance. For applications and protocols that do not support encryption, sessions transmitting PHI can be sent through encrypted tunnels using IPsec or similar implementations between instances.

Amazon EC2 instances that customers use to process, store, or transmit PHI are run on Dedicated Instances, which are instances that run in an Amazon VPC on hardware dedicated to a single customer. Dedicated Instances are physically isolated at the host hardware level from instances that are not Dedicated Instances and from instances that belong to other AWS accounts. For more information, see the Dedicated Instances documentation.

Customers can launch Amazon EC2 Dedicated Instances in several ways:

  • Set the tenancy attribute of an Amazon VPC to “dedicated” so that all instances launched into the Amazon VPC will run as Dedicated Instances
  • Set the placement tenancy attribute of an Auto-Scaling Launch Configuration for instances launched into an Amazon VPC
  • Set the tenancy attribute of an instance launched into an Amazon VPC
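As a sketch, the three options above correspond to setting a tenancy attribute in three different API request shapes. The dictionaries below only assemble the parameters (field names follow the EC2 and Auto Scaling APIs; the AMI ID, subnet ID, and resource names are placeholders):

```python
# Three places the "dedicated" tenancy can be requested. These dicts mirror
# the shape of the corresponding API calls; no call is made here.

# 1. VPC-level default tenancy: every instance launched into the VPC is dedicated.
vpc_params = {"CidrBlock": "10.0.0.0/16", "InstanceTenancy": "dedicated"}

# 2. Per-instance tenancy at launch time (EC2 RunInstances).
run_params = {
    "ImageId": "ami-EXAMPLE",         # placeholder AMI ID
    "MinCount": 1,
    "MaxCount": 1,
    "SubnetId": "subnet-EXAMPLE",     # placeholder subnet in the VPC
    "Placement": {"Tenancy": "dedicated"},
}

# 3. Auto Scaling launch configuration placement tenancy.
launch_config = {
    "LaunchConfigurationName": "phi-workers",  # placeholder name
    "PlacementTenancy": "dedicated",
}
```

Any one of the three settings is sufficient for the resulting instances to run as Dedicated Instances.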

Amazon Virtual Private Cloud offers a set of network security features well-aligned to architecting for HIPAA compliance. Features such as stateless network access control lists and dynamic reassignment of instances into stateful security groups afford flexibility in protecting the instances from unauthorized network access. Amazon VPC also allows customers to extend their own network address space into AWS, as well as providing a number of ways to connect their data centers to AWS. VPC Flow Logs provide an audit trail of accepted and rejected connections to instances processing, transmitting or storing PHI. For more information on Amazon VPC, see http://aws.amazon.com/vpc/.

Amazon Elastic Block Store

Amazon EBS encryption at rest is consistent with the Guidance that is in effect at the time of publication of this whitepaper. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon EBS encryption satisfies their compliance and regulatory requirements. With Amazon EBS encryption, a unique volume encryption key is generated for each EBS volume; customers have the flexibility to choose which master key from the AWS Key Management Service is used to encrypt each volume key. For more information, see the Amazon EBS encryption documentation.

Amazon Redshift

Amazon Redshift provides database encryption for its clusters to help protect data at rest. When customers enable encryption for a cluster, Amazon Redshift encrypts all data, including backups, by using hardware-accelerated Advanced Encryption Standard (AES)-256 symmetric keys. Amazon Redshift uses a four-tier, key-based architecture for encryption. These keys consist of data encryption keys, a database key, a cluster key, and a master key. The cluster key encrypts the database key for the Amazon Redshift cluster. Customers can use either AWS KMS or an AWS CloudHSM (Hardware Security Module) to manage the cluster key. Amazon Redshift encryption at rest is consistent with the Guidance that is in effect at the time of publication of this whitepaper. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon Redshift encryption satisfies their compliance and regulatory requirements.
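The four-tier hierarchy can be illustrated structurally. The XOR-based wrap below is only a stand-in for the hardware-accelerated AES-256 encryption Redshift actually uses; the point is that each key is stored only in encrypted form under the tier above it:

```python
import hashlib
import os

def wrap(kek: bytes, key: bytes) -> bytes:
    # Stand-in for AES-256 key wrapping: XOR with a stream derived from the
    # key-encryption key. NOT real cryptography; structural illustration only.
    stream = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

unwrap = wrap  # XOR is its own inverse

master_key   = os.urandom(32)  # tier 1: managed in AWS KMS or CloudHSM
cluster_key  = os.urandom(32)  # tier 2: per-cluster key
database_key = os.urandom(32)  # tier 3: per-database key
data_key     = os.urandom(32)  # tier 4: encrypts data blocks on disk

# Each key is persisted only in wrapped form, protected by the tier above.
wrapped_cluster  = wrap(master_key, cluster_key)
wrapped_database = wrap(cluster_key, database_key)
wrapped_data     = wrap(database_key, data_key)

# Decryption walks back down the hierarchy.
assert unwrap(database_key, wrapped_data) == data_key
```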

Amazon S3

Customers have several options for encryption of data at rest when using Amazon S3, including both server-side and client-side encryption and several methods of managing keys.

Amazon Glacier

Amazon Glacier automatically encrypts data at rest using AES 256-bit symmetric keys and supports transfer of customer data over secure protocols.
Connections to Amazon Glacier containing PHI must use endpoints that accept encrypted transport (HTTPS). For a list of regional endpoints, see the AWS Regions and Endpoints documentation.

Amazon RDS for MySQL

Amazon RDS for MySQL allows customers to encrypt MySQL databases using keys that customers manage through AWS KMS. On a database instance running with Amazon RDS encryption, data stored at rest in the underlying storage is encrypted consistent with the Guidance in effect at the time of publication of this whitepaper, as are automated backups, read replicas, and snapshots. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon RDS for MySQL encryption satisfies their compliance and regulatory requirements. For more information on encryption at rest using Amazon RDS, see the Amazon RDS documentation.

Amazon RDS for Oracle

Customers have several options for encrypting PHI at rest using Amazon RDS for Oracle. Customers can encrypt Oracle databases using keys that customers manage through AWS KMS. On a database instance running with Amazon RDS encryption, data stored at rest in the underlying storage is encrypted consistent with the Guidance in effect at the time of publication of this whitepaper, as are automated backups, read replicas, and snapshots. Because the Guidance might be updated, customers should continue to evaluate and determine whether Amazon RDS for Oracle encryption satisfies their compliance and regulatory requirements. For more information on encryption at rest using Amazon RDS, see the Amazon RDS documentation.

Elastic Load Balancing

Customers may use Elastic Load Balancing to terminate and process sessions containing PHI, choosing either the Classic Load Balancer or the Application Load Balancer. Because all network traffic containing PHI must be encrypted in transit end-to-end, customers have the flexibility to implement two different architectures:

Customers can terminate HTTPS, HTTP/2 over TLS (for the Application Load Balancer), or SSL/TLS on Elastic Load Balancing by creating a load balancer that uses an encrypted protocol for connections. This feature enables traffic encryption between the customer's load balancer and the clients that initiate HTTPS, HTTP/2 over TLS, or SSL/TLS sessions, and for connections between the load balancer and customer back-end instances. Sessions containing PHI must use encrypted protocols on both front-end and back-end listeners for transport encryption. Customers should evaluate their certificates and session negotiation policies and keep them consistent with the Guidance. For more information, see the Elastic Load Balancing documentation.
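For the first architecture, a Classic Load Balancer listener set that keeps PHI encrypted on both legs might look like the following sketch (field names follow the Classic Load Balancer API; the certificate ARN is a placeholder):

```python
# Listener definition for end-to-end transport encryption: HTTPS on the
# front-end (client -> load balancer) AND on the back-end
# (load balancer -> instance). No API call is made; this only shows the shape.
listeners = [
    {
        "Protocol": "HTTPS",          # front-end listener protocol
        "LoadBalancerPort": 443,
        "InstanceProtocol": "HTTPS",  # back-end listener protocol
        "InstancePort": 443,
        # Placeholder certificate ARN for TLS termination:
        "SSLCertificateId": "arn:aws:iam::123456789012:server-certificate/EXAMPLE",
    }
]

# Both legs must use an encrypted protocol for sessions containing PHI.
assert all(
    l["Protocol"] == "HTTPS" and l["InstanceProtocol"] == "HTTPS"
    for l in listeners
)
```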

Amazon EMR

Amazon EMR deploys and manages a cluster of Amazon EC2 instances into a customer’s account. All Amazon EC2 instances that process, store, or transmit PHI must be Dedicated Instances. In order to meet this requirement, EMR clusters must be created in a VPC with tenancy attribute of “dedicated.” This ensures that all cluster nodes (instances) launched into the VPC will run as Dedicated Instances.

Amazon DynamoDB

Connections to Amazon DynamoDB containing PHI must use endpoints that accept encrypted transport (HTTPS). For a list of regional endpoints, see http://docs.aws.amazon.com/general/latest/gr/rande.html#ddb_region.

PHI stored in Amazon DynamoDB must be encrypted at rest consistent with the Guidance. Amazon DynamoDB customers can use the application development framework of their choice to encrypt PHI in applications before storing the data in Amazon DynamoDB. Alternatively, a client-side library for encrypting content is available from the AWS Labs GitHub repository. Customers may evaluate this implementation for consistency with the Guidance. For more information, see https://github.com/awslabs/aws-dynamodb-encryption-java. Careful consideration should be given to the choice of primary keys and indexes so that unsecured PHI is not required for queries and scans in Amazon DynamoDB.
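A minimal sketch of the encrypt-before-put pattern follows. The XOR keystream is a deliberately toy stand-in for a real cipher (in practice the DynamoDB Encryption Client or AES-GCM would be used), and the attribute names are hypothetical; note that the partition key is left unencrypted so queries and scans still work without exposing PHI:

```python
import base64
import itertools
import os

KEY = os.urandom(32)  # in practice, a data key protected by AWS KMS

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (repeating XOR keystream). NOT real cryptography;
    # it only illustrates encrypt-before-put / decrypt-after-get.
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

def encrypt_attr(value: str) -> str:
    return base64.b64encode(xor_cipher(value.encode(), KEY)).decode()

def decrypt_attr(value: str) -> str:
    return xor_cipher(base64.b64decode(value), KEY).decode()

# Keep the primary key free of PHI and unencrypted; encrypt the PHI attribute.
item = {
    "patient_id": "p-1001",                    # partition key: no PHI here
    "diagnosis": encrypt_attr("hypertension"), # PHI stored only as ciphertext
}

assert decrypt_attr(item["diagnosis"]) == "hypertension"
```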

Using AWS KMS for Encryption of PHI

Master keys in AWS KMS can be used to encrypt/decrypt data encryption keys used to encrypt PHI in customer applications or in AWS services that are integrated with AWS KMS. AWS KMS can be used in conjunction with a HIPAA account, but PHI may only be processed, stored, or transmitted in HIPAA-eligible services. KMS does not need to be a HIPAA-eligible service so long as it is used to generate and manage keys for applications running in other HIPAA-eligible services. For example, an application processing PHI in Amazon EC2 could use the GenerateDataKey API call to generate data encryption keys for encrypting and decrypting PHI in the application. The data encryption keys would be protected by customer master keys stored in AWS KMS, creating a highly auditable key hierarchy, as API calls to AWS KMS are logged in AWS CloudTrail.
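The envelope-encryption pattern can be sketched as follows. The kms_generate_data_key and kms_decrypt stubs mimic only the shape of the real AWS KMS responses (a Plaintext data key plus a CiphertextBlob wrapped under the master key); the XOR wrap stands in for the encryption KMS performs under the CMK:

```python
import hashlib
import os

MASTER_KEY = os.urandom(32)  # held inside KMS; never leaves the service

def _wrap(key: bytes) -> bytes:
    # Stand-in for encryption under the CMK (XOR with a CMK-derived stream).
    stream = hashlib.sha256(MASTER_KEY).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

def kms_generate_data_key() -> dict:
    # Mimics the GenerateDataKey response shape: a plaintext data key plus
    # the same key encrypted under the customer master key.
    plaintext_key = os.urandom(32)
    return {"Plaintext": plaintext_key, "CiphertextBlob": _wrap(plaintext_key)}

def kms_decrypt(ciphertext_blob: bytes) -> bytes:
    return _wrap(ciphertext_blob)  # XOR is its own inverse

resp = kms_generate_data_key()
# The application encrypts PHI locally with resp["Plaintext"], discards the
# plaintext key, and stores resp["CiphertextBlob"] alongside the ciphertext.
recovered = kms_decrypt(resp["CiphertextBlob"])
assert recovered == resp["Plaintext"]
```

In the real service, every GenerateDataKey and Decrypt call is recorded in AWS CloudTrail, which is what makes the hierarchy auditable.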

Auditing, Back-Ups, and Disaster Recovery

HIPAA’s Security Rule also requires in-depth auditing capabilities, data back-up procedures, and disaster recovery mechanisms. The services in AWS contain many features that help customers address these requirements.

In designing an information system that is consistent with HIPAA and HITECH requirements, customers should put auditing capabilities in place that allow security analysts to examine detailed activity logs and reports showing who had access, from which IP addresses, what data was accessed, and so on. This data should be tracked, logged, and stored in a central location for extended periods of time in case of an audit. Using Amazon EC2, customers can run activity log files and audits down to the packet layer on their virtual servers, just as they do on traditional hardware. They can also track any IP traffic that reaches their virtual server instance. A customer's administrators can back up the log files into Amazon S3 for long-term reliable storage.

Under HIPAA, covered entities must have a contingency plan to protect data in case of an emergency and must create and maintain retrievable exact copies of electronic PHI. To implement a data back-up plan on AWS, Amazon EBS offers persistent storage for Amazon EC2 virtual server instances. These volumes can be exposed as standard block devices, and they offer off-instance storage that persists independently from the life of an instance. To align with HIPAA guidelines, customers can create point-in-time snapshots of Amazon EBS volumes that are automatically stored in Amazon S3 and replicated across multiple Availability Zones, which are distinct locations engineered to be insulated from failures in other Availability Zones. These snapshots can be accessed at any time and can protect data for long-term durability. Amazon S3 also provides a highly available solution for data storage and automated back-ups. By simply loading a file or image into Amazon S3, multiple redundant copies are automatically created and stored in separate data centers. These copies can be accessed at any time, from anywhere (based on permissions), and are stored until intentionally deleted.

Disaster recovery, the process of protecting an organization’s data and IT infrastructure in times of disaster, is typically one of the more expensive HIPAA requirements to comply with. This involves maintaining highly available systems, keeping both the data and system replicated off-site, and enabling continuous access to both. AWS inherently offers a variety of disaster recovery mechanisms.

With Amazon EC2, administrators can start server instances very quickly and can use an Elastic IP address (a static IP address for the cloud computing environment) for graceful failover from one machine to another. Amazon EC2 also offers Availability Zones. Administrators can launch Amazon EC2 instances in multiple Availability Zones to create geographically diverse, fault tolerant systems that are highly resilient in the event of network failures, natural disasters, and most other probable sources of downtime. Using Amazon S3, a customer’s data is replicated and automatically stored in separate data centers to provide reliable data storage designed to provide 99.99% availability.

For more information on disaster recovery, see the AWS Disaster Recovery whitepaper available at http://aws.amazon.com/disaster-recovery/.

Reduces Time. Reduces Cost. Reduces Risk

AWS helps customers reduce the cost of becoming HIPAA compliant, and it significantly reduces the time required as well by avoiding costly delays and mistakes. We understand which AWS components are not supported in a HIPAA environment, which ones are supported, and how to implement them to meet the HIPAA standards. Connectria's staff will also assist you in getting a Business Associate Agreement (BAA) signed with Amazon, and we will enter into a BAA directly with each of our customers as well.

Connectria's security controls and processes go far beyond AWS and extend throughout our entire company to all of our employees. Each of our staff members is required to take and pass HIPAA compliance certification, and we undergo an annual HIPAA HITECH assessment by a qualified third-party assessor to ensure that Connectria and our employees continue to meet the HIPAA HITECH standards. Connectria provides access to our HIPAA Compliance Team at no additional cost to assist our customers with achieving their HIPAA compliance objectives.

Many of our customers include Independent Software Vendors (ISVs) who serve the healthcare market and require HIPAA compliance. Some also wish to move their applications to a hosted Software as a Service (SaaS) model. Whether you are a Covered Entity, a Business Associate, or a technology provider to the healthcare market, Connectria can help you implement and manage a HIPAA Compliant environment in AWS.

HIPAA Compliant Website

  • Information that is being transported must ALWAYS be encrypted.
  • PHI is backed up and is recoverable.
  • Using unique access controls, the information is accessible only by authorized personnel.
  • The information is not tampered with or altered.
  • Information can be permanently disposed of when no longer needed.
  • Information is located on a server that is secured per the HIPAA Security Rule requirements and/or hosted by a web server company with which you have a HIPAA Business Associate Agreement.

Pentaho DB related Queries

Database host on AWS

Database Questions

1. MySQL Bulk Loader step in Pentaho: We are having issues with the Fifo file parameter. Can we use this step when Spoon is installed on a Windows machine? We are running Pentaho locally and piping the data into AWS. Please see the screen capture below.

ANS:- Fifo File - This is the fifo file used as a named pipe. When it does not exist, it is created with the commands mkfifo and chmod 666 (this is the reason why it does not work on Windows).
Workaround: use the MySQL bulk loader job entry to process a whole file (suboptimal). Not supported, but worth testing: mkfifo and chmod are provided by ports of the GNU Core Utilities.
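The platform dependency is easy to verify directly: Python exposes the same POSIX primitive the step relies on, and it is simply absent on Windows. A small check, assuming a POSIX system:

```python
# os.mkfifo exists only on POSIX systems, which is why the bulk loader's
# named-pipe approach fails under Windows. This creates and verifies a fifo.
import os
import stat
import tempfile

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "bulkload.fifo")
    os.mkfifo(path, 0o666)  # what the step effectively runs via mkfifo + chmod 666
    is_fifo = stat.S_ISFIFO(os.stat(path).st_mode)

print(is_fifo)
```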

2. We received an error message relating to the 'path to the psql client' in PostgreSQL bulk loader step. How can we find and apply the path to the psql client on our amazon EC2 instance running PostgreSQL?

ANS:- First we need to define a parameter called psql_path in the kettle.properties file, e.g.:
psql_path=c\:/Program Files (x86)/pgAdmin III/1.16/psql.exe
Then we need to set the bulk loader's "Path to the psql client" property to ${psql_path}.

3. What parameters should be set to increase data transfer speeds to a postgres database?

ANS:- PostgreSQL performance
The optimal settings for PostgreSQL depend not only on the hardware configuration, but also on the size of the database, the number of clients, and the complexity of queries, so the database can be tuned optimally only once all of these parameters are known.
PostgreSQL settings (add/modify these settings in postgresql.conf and restart the database):
  1. max_connections = 10
  2. shared_buffers = 2560MB
  3. effective_cache_size = 7680MB
  4. work_mem = 256MB
  5. maintenance_work_mem = 640MB
  6. min_wal_size = 1GB
  7. max_wal_size = 2GB
  8. checkpoint_completion_target = 0.7
  9. wal_buffers = 16MB
  10. default_statistics_target = 100

4. If there are disallowed characters for a postgres text field, what is the best way to handle them, e.g. the ASCII NUL character?

Ans:- solution 1:- The NULLIF function returns a null value if value1 equals value2; otherwise it returns value1. This can be used to perform the inverse operation of COALESCE, replacing a sentinel value with NULL.

solution 2:- There are different ways to handle special characters. E.g. escaping single quotes (') by doubling them up ('') is the standard way and works of course, e.g. 'user's log' becomes 'user''s log'.
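Both techniques are standard SQL and can be tried in any engine; the sketch below uses Python's built-in sqlite3 module as a stand-in for PostgreSQL (NULLIF and quote doubling behave the same way in both), and also shows the parameterized-query alternative that avoids manual escaping entirely:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# NULLIF maps a sentinel value to SQL NULL instead of storing it verbatim.
row = conn.execute("SELECT NULLIF(?, '')", ("",)).fetchone()
assert row[0] is None

# Doubling a single quote is the standard SQL escape for string literals.
row = conn.execute("SELECT 'user''s log'").fetchone()
assert row[0] == "user's log"

# Parameterized queries sidestep escaping altogether (and prevent injection).
row = conn.execute("SELECT ?", ("user's log",)).fetchone()
assert row[0] == "user's log"
```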

5. We are moving data from a DB2 database to AWS. The goal is to update the data in less than 8 hours. We have nine tables and the largest table includes about 130 million rows. Is this feasible? What is the best way to implement this strategy on AWS?

Ans:- solution 1:- In the first two parts of this series we discussed two popular products--out of many possible solutions--for moving big data into the cloud: Tsunami UDP and Data Expedition’s ExpeDat S3 Gateway. Today we’ll look at another option that takes a different approach: Signiant Flight.

solution 2:- AWS Import/Export is a service you can use to transfer large amounts of data from physical storage devices into AWS. You mail your portable storage devices to AWS, and AWS Import/Export transfers data directly off of your storage devices using Amazon's high-speed internal network. Your data load typically begins the next business day after your storage device arrives at AWS. After the data export or import completes, we return your storage device. For large data sets, AWS data transfer can be significantly faster than Internet transfer and more cost effective than upgrading your connectivity.

solution 3:- Snowball is a petabyte-scale data transport solution that uses secure appliances to transfer large amounts of data into and out of the AWS cloud. Using Snowball addresses common challenges with large-scale data transfers including high network costs, long transfer times, and security concerns. Transferring data with Snowball is simple, fast, secure, and can be as little as one-fifth the cost of high-speed Internet.

6. What is the largest dataset (relational database table) that Pragmatic has moved to AWS? How long did it take to update such a table? What performance strategies did Pragmatic undertake to achieve peak performance when updating such a table?

Ans:- Consider typical network speeds and how long it would take to move a terabyte dataset: depending on the network throughput available to you and the data set size, it may take rather long to move your data into Amazon S3. To help customers move their large data sets into Amazon S3 faster, we offer them the ability to do this over Amazon's internal high-speed network using AWS Import/Export.

7. What is Pragmatic suggested approach for setting up ETL architecture for an AWS based datacenter?

Ans:- With Amazon Simple Workflow (Amazon SWF), AWS Data Pipeline, and AWS Lambda, you can build analytic solutions that are automated, repeatable, scalable, and reliable. In this post, I show you how to use these services to migrate and scale an on-premises data analytics workload.

Workflow basics

A business process can be represented as a workflow. Applications often incorporate a workflow as steps that must take place in a predefined order, with opportunities to adjust the flow of information based on certain decisions or special cases; an ETL pipeline is a typical example. SWF coordinates the steps of such a workflow across distributed components.

8. Rather than using Pentaho CE for ETL and reporting, what do you think are the advantages/disadvantages of implementing a hybrid environment running Pentaho ETL and Tableau Server? Have you implemented such a mixed environment for any of your clients?

Ans:- This can be done. Tableau does not have ETL, so we can use Pentaho ETL with Tableau; we have worked with Tableau and Pentaho in combination. You can use Pentaho for ETL and visualize the data using Tableau.

9. Do you have any clients in the United States that use Pragmatic support for Pentaho?

Ans:- We are a products and services company working primarily in ERP, CRM, BI, and Analytics. We have worked with several customers from the United States and can give you a reference for ERP deployment and report generation.

10. Do you have any clients in the United States that used Pragmatic consulting services for setting up their ETL architecture? If so, do you mind listing them as a referral?

Ans:- We have customers in the United States and around the world, in countries such as Australia, New Zealand, Switzerland, and Belgium, who have used our AWS consulting expertise, not limited to Pentaho. We have also deployed scalable architectures on the AWS cloud. Unfortunately, most of these companies are middlemen, and since we have signed NDAs with them, we cannot disclose their names. But we can definitely give you references of companies in the United States with whom we have worked on other technologies such as ERP. Will that work for you?


Internet-of-Things Work Scope- intel edison

The Internet of Things (IoT) has become one of the biggest disruptive technologies in the world, extending connectivity across wide area networks. Many companies now build on AWS IoT technology, whose services relay messages quickly from one medium to another. The Internet of Things and its services are becoming part of our everyday life, ways of working, and business, as a core area of Information and Communications Technology.

About the Internet of Things

AWS IoT can connect billions of devices and send trillions of messages, and can process and route those messages to AWS endpoints and to other devices in a reliable and secure manner. With AWS IoT, your applications can communicate with all your devices, all the time. AWS IoT makes it easy to use AWS services such as DynamoDB, RDS, Lambda, Kinesis, S3, and Machine Learning to build IoT applications that gather, process, analyze, and act on data generated by connected devices, entirely in the cloud.

AWS IoT Architecture

AWS IoT Hardware Device: Intel® Edison and Grove IoT Starter Kit Powered by AWS

The bundle includes the Grove IoT Environmental Kit* from Seeed Studios, a rapid-prototyping kit for designing indoor applications based on the Intel® Edison development board, and Amazon Web Services* (AWS), a suite of services that enables secure, bidirectional communications between the device and the cloud. AWS IoT* is a platform that allows devices (cars, turbines, sensor grids, light bulbs, and more) to connect to AWS services so companies can store, process, analyze, and act on the volumes of data generated by connected devices on a global scale. With a base shield that can connect up to 11 different sensors and actuators and access to AWS, you can easily create a new Internet of Things (IoT) device to explore and interact with your indoor environment. AWS extends the functionality of the Grove Indoor Environmental Kit* for Intel Edison, adding the ability to transform, augment, or route messages to the AWS cloud with secure authentication from X.509 certificates installed on your device. You can also control how your IoT clients, such as microcontrollers, sensors, actuators, mobile devices, or applications, connect to the AWS cloud, with built-in services and SDKs to fine-tune communication, rules, and roles.

Parts List:

Board/Part Qty Documentation
Intel® Edison for Arduino 1 Read Here
Base Shield 1 Read Here
Grove - Temperature&Humidity Sensor (High-Accuracy & Mini) 1 Read Here
Grove - Moisture Sensor 1 Read Here
Grove - Light Sensor 1 Read Here
Grove - UV Sensor 1 Read Here
Grove - PIR Motion Sensor 1 Read Here
Grove - Encoder 1 Read Here
Grove - Button 1 Read Here
Grove - LCD RGB Backlight 1 Read Here
Grove - Relay 1 Read Here
Grove - Servo 1 Read Here
Grove - Buzzer 1 Read Here
USB Cable; 480mm-Black 1 -
USB Wall Power Supply 1 -

Project Scope of Work

We expect the team to get all 5 sensors listed below connected using the AWS IoT architecture shown above. Once the device is connected to AWS, we should be able to capture the data in DynamoDB. The complete project should use the Device Gateway and Device Shadow with TLS authentication and the MQTT protocol. The sensors to be used are defined below.

Intel® device With AWS IoT Architecture

Intel® Edison for Arduino


  • Uses a 22nm Intel® SoC that includes a dual core, dual threaded Intel® Atom™ CPU at 500MHz and a 32-bit Intel® Quark™ microcontroller at 100 MHz. It supports 40 GPIOs and includes 1GB LPDDR3, 4 GB EMMC, and dual-band WiFi and BTLE on a module slightly larger than a postage stamp.
  • The Intel Edison module will initially support development with Arduino* and C/C++, followed by Node.JS, Python, RTOS, and Visual Programming support in the near future.
  • It includes a device-to-device and device-to-cloud connectivity framework to enable cross-device communication and a cloud-based, multi-tenant, time-series analytics service.
  • Has an SD card connector, micro USB or standard-sized USB host Type-A connector (via mechanical switch), micro USB device, 6 analog inputs, 20 digital input/output pins, 1x UART, 1x I2C, 1x ICSP 6-pin header (SPI), and a power jack with 7V-15V DC input.


Grove - Temperature & Humidity Sensor (High-Accuracy & Mini)

This is a multifunctional sensor that gives you temperature and relative humidity information at the same time. It utilizes a TH02 sensor that can meet general-purpose measurement needs. It provides reliable readings when ambient humidity is between 0-80% RH and temperature is between 0-70°C, covering the needs of most home and daily applications that don't involve extreme conditions.

Grove - Moisture Sensor

The Grove - Moisture Sensor can be used to detect the moisture of soil and judge whether there is dampness around the sensor, for example to decide whether the plants in a garden need watering and to automate watering them. It can be used very easily by just inserting the sensor into the soil and reading the output using an ADC.

Grove - Light Sensor

The Grove - Light Sensor module uses a GL5528 photoresistor (light-dependent resistor) to detect the intensity of light in the environment. The resistance of the photoresistor decreases as the intensity of light increases. A dual op-amp chip (LM358) on board produces a voltage corresponding to the intensity of light (i.e., based on the resistance value). The output signal from this module is HIGH in bright light and LOW in the dark. This module can be used to build a light-controlled switch, i.e., one that switches lights off during the day and on at night.
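The underlying voltage-divider behavior can be sketched numerically (the resistor values below are illustrative, not the GL5528's datasheet figures):

```python
# A photoresistor in a divider with a fixed resistor yields an output voltage
# that rises with light, since the LDR's resistance falls as light increases.
def divider_out(vcc: float, r_fixed: float, r_ldr: float) -> float:
    # LDR on the high side, fixed resistor on the low side:
    # Vout = Vcc * R_fixed / (R_fixed + R_ldr)
    return vcc * r_fixed / (r_fixed + r_ldr)

bright = divider_out(5.0, 10_000, 1_000)    # LDR resistance low in bright light
dark   = divider_out(5.0, 10_000, 100_000)  # LDR resistance high in the dark
assert bright > dark  # more light -> higher output voltage
```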

Grove - UV Sensor

The Grove – UV Sensor is used for detecting the intensity of incident ultraviolet (UV) radiation. This form of electromagnetic radiation has shorter wavelengths than visible radiation. It is based on the GUVA-S12D sensor, which has a wide spectral range of 200nm-400nm. The module outputs an electrical signal that varies with the UV intensity. UV sensors are used for determining exposure to ultraviolet radiation in laboratory or environmental settings.

Grove - PIR Motion Sensor

This is a simple-to-use PIR motion sensor with a Grove-compatible interface. Simply connect it to the Stem shield and program it; when anyone moves within its detecting range, the sensor outputs HIGH on its SIG pin.
The detecting range and response speed can be adjusted via 2 potentiometers soldered on its circuit board; the response speed ranges from 0.3s to 25s, and the detecting range reaches a maximum of 6 meters.

AWS Device Shadow

A thing shadow (sometimes referred to as a device shadow) is a JSON document that is used to store and retrieve current state information for a thing (device, app, and so on). The Thing Shadows service maintains a thing shadow for each thing you connect to AWS IoT. You can use thing shadows to get and set the state of a thing over MQTT or HTTP, regardless of whether the thing is connected to the Internet.

Device Shadows Data Flow

The Thing Shadows service acts as an intermediary, allowing devices and applications to retrieve and update thing shadows.
The Thing Shadows service uses a number of MQTT topics to facilitate communication between applications and devices. To see how this works, use the AWS IoT MQTT client to subscribe to the following MQTT topics with QoS 1:
  • update/accepted - the Thing Shadows service sends messages to this topic when an update is successfully made to a thing shadow.
  • update/rejected - the Thing Shadows service sends messages to this topic when an update to a thing shadow is rejected.
  • update/delta - the Thing Shadows service sends messages to this topic when a difference is detected between the reported and desired sections of a thing shadow.
  • get/accepted - the Thing Shadows service sends messages to this topic when a request for a thing shadow is made successfully.
  • get/rejected - the Thing Shadows service sends messages to this topic when a request for a thing shadow is rejected.
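The delta behavior can be sketched with a plain JSON document. The shadow below is a hypothetical example; the delta contains exactly the desired keys whose values differ from the reported state:

```python
import json

# A thing shadow is a JSON document with "desired" and "reported" sections;
# the service publishes a delta message when the two differ.
shadow = {
    "state": {
        "desired":  {"led": "on",  "sample_rate_s": 5},
        "reported": {"led": "off", "sample_rate_s": 5},
    }
}

def delta(state: dict) -> dict:
    # Keys whose desired value differs from the reported value.
    desired, reported = state["desired"], state["reported"]
    return {k: v for k, v in desired.items() if reported.get(k) != v}

d = delta(shadow["state"])
assert d == {"led": "on"}  # sample_rate_s matches, so it is not in the delta
print(json.dumps(d))
```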

AWS IoT Device Gateway

The AWS IoT Device Gateway enables devices to securely and efficiently communicate with AWS IoT. The Device Gateway can exchange messages using a publication/subscription model, which enables one-to-one and one-to-many communications. With this one-to-many communication pattern AWS IoT makes it possible for a connected device to broadcast data to multiple subscribers for a given topic. The Device Gateway supports MQTT, WebSockets, and HTTP 1.1 protocols and you can easily implement support for proprietary or legacy protocols. The Device Gateway scales automatically to support over a billion devices without provisioning infrastructure.

Amazon DynamoDB

Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed cloud database and supports both document and key-value store models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad tech, IoT, and many other applications. Start today by downloading the local version of DynamoDB, then read our Getting Started Guide.

Reference Material

Interactive tutorial
AWS IoT is a managed cloud service that lets connected devices -- cars, light bulbs, sensor grids and more -- easily and securely interact with cloud applications and other devices.
This interactive tutorial will help you to get started quickly by demonstrating the following service features:
  • Connect things to the Device Gateway
  • Process and act on data with the Rules Engine
  • Read and set device state with Device Shadows

IoT news and features

The Internet of Things is an evolving term used to describe objects and their virtual representations communicating via an internet-like network. The concept has been discussed since 1991, when the initial idea was based on control networks that would allow for the remote control and monitoring of devices, inventory, and factory functions. Today, the term Internet of Things relates to the advanced connectivity of devices, systems, and services, going beyond mere machine-to-machine (M2M) communication. It is estimated that by 2020 there will be over 25 billion devices wirelessly connected to the Internet of Things, including embedded and wearable computing devices. We track this emerging phenomenon.

Application Scope

IoT is used in a wide range of applications, depending on the cloud network mode, level, and WAN coverage area, and these days many companies are moving to AWS IoT. GE CEO Jeff Immelt said that a global network connecting people, data, and machines, called the Industrial Internet, had the potential to add $10 to $15 trillion to global GDP in the next 20 years. GE plans to invest $1 billion in the "development of industrial internet technology and applications to make customers more productive." The IoT concept enables fully automated monitoring and reporting of utilities, plants, and animals over the internet; the top-ranking IoT application areas are the smart home, smart city, smart grids, the Industrial Internet, and connected health (digital health/telehealth/telemedicine).


Odoo – E-Bay Connector

New module released by Pragmatic Techsoft Pvt. Ltd. It enables users to synchronize product information from Odoo to eBay. The following features are available in this connector:

1. Provision to set up production and sandbox eBay accounts for synchronization.

2. Synchronization of Currencies and Countries of e-Bay

3. Provision to Synchronise product categories and specific category hierarchy from e-Bay to Odoo

4. e-Bay order synchronisation to Odoo

5. Export products from Odoo to e-Bay: capture e-Bay-specific information in a separate tab

6. Sales Order synchronisation from e-Bay to Odoo

7. Automated work-flow for creating Invoicing and payments in Odoo


Odoo 10 Community and Enterprise Edition Features - MRP + Maintenance+ PLM + Quality

Odoo is a best-selling open source suite of business applications that allows businesses to attain an unmatched competitive advantage. The suite is simpler to use than other enterprise software, yet remains highly functional for businesses of all scales and sizes. The new version, Odoo 10, is expected to enhance the future readiness of enterprise applications with a focus on Manufacturing.

Value Proposition of Odoo 10

The world will see the launch of Odoo 10 at the yearly conference Odoo Experience 2016 on October 5th in Belgium. Slated for release in the first week of October, Odoo 10 is sure to wow the world with its Manufacturing upgrade and enterprise-grade functionality. The suite is split into two versions: the Enterprise Edition and the Community Edition.

Odoo 10 is packed with some amazing Manufacturing Resource Planning (MRP) features:

PLM
  • Bill of Materials
  • Versioning
Manufacturing Engineering
  • Routings
  • Worksheets
  • Planning
  • Control Panel
  • Work orders
Supply Chain
  • MPS
  • Routes
  • Procurement Rules
Quality
  • Control Points
  • Checks (SPC)
  • Alerts
Maintenance
  • Equipment Management
  • Maintenance Requests

Split of Modules between Community and Enterprise Edition

Features in Odoo 10 Community Edition
  • bill of materials, kits, manufacturing order, routing, work orders
  • scheduling of MO (expected dates)
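The "scheduling of MO (expected dates)" bullet above can be illustrated with a tiny backward-scheduling sketch: given the date the finished goods are committed to the customer, the planned start date of a manufacturing order is derived by subtracting the product's manufacturing lead time (plus any security lead time). This is a simplified assumption of how such scheduling works, not Odoo's actual implementation.

```python
# Simplified sketch of backward scheduling for a manufacturing order.
# The lead-time parameters are assumptions standing in for the product's
# manufacturing lead time and a company-wide security lead time.

from datetime import date, timedelta

def planned_start(commitment_date, manufacturing_lead_days, security_lead_days=0):
    """Date the MO should start in order to meet commitment_date."""
    buffer_days = manufacturing_lead_days + security_lead_days
    return commitment_date - timedelta(days=buffer_days)

# Example: goods promised for 20 Oct, 5 days to manufacture, 2 days margin
start = planned_start(date(2016, 10, 20), 5, security_lead_days=2)
# -> date(2016, 10, 13)
```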

Features in Odoo 10 Enterprise Edition
  • mrp_floor_plant: plant floor dashboard
  • scheduling of work orders
  • work centers planning
  • statistics & dashboard
  • touchscreen and barcode UI
  • Maintenance
  • Quality - documentation

Other Enterprise Level Apps that Deliver Incredible Outcomes to Business

Creative Website Builder CMS
  • Brand New WYSIWYG Editor
  • Website Versions
  • More Powerful Building Blocks
  • Quick Form Builder
  • Updated Customization Tools
  • A/B Testing Modes
  • Support for Multiple Websites

Secure and Scalable E-commerce
  • Customer Portal
  • Support for Digital Products such as publications and code
  • Payment Gateway Integrations: PayPal, Authorize.net
  • Built-in support for shipping companies such as FedEx, UPS, USPS, DHL
  • eBay and Amazon Integration

Intuitive User Interface

  • Smarter Navigation Menu
  • Fully responsive with support for Mobile Web on Android, iOS
  • Slick looking Dashboard with Charts and Graphs

Accounting, Invoicing and Payments solution that delivers superior RoI
  • Improved Reconciliation
  • Smarter Financial Year Closing
  • Simplified Configuration
  • Assets
  • Send invoices to customers by email in one click
  • Integration with 24,000 Banks
  • Support for Single Euro Payments Area (SEPA) Payments
  • Import bank statements in OFX, QIF, CSV and CODA formats
  • Print Checks
  • Batch Deposit
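As a concrete illustration of the bank statement import mentioned above, here is a minimal sketch of parsing a CSV statement into the kind of line dictionaries an import wizard would hand to the accounting module. The column names (`date`, `label`, `amount`) are assumptions; real bank exports vary, and this is not Odoo's actual importer.

```python
# Hypothetical sketch: turn a 'date,label,amount' CSV bank statement into
# simple statement-line dicts, as a pre-import step. Column names assumed.

import csv
import io

def parse_csv_statement(csv_text):
    """Parse CSV rows into statement-line dicts ready for import."""
    reader = csv.DictReader(io.StringIO(csv_text))
    lines = []
    for row in reader:
        lines.append({
            "date": row["date"],
            "name": row["label"],            # payment communication / memo
            "amount": float(row["amount"]),  # negative = outgoing payment
        })
    return lines
```

Each resulting dict maps one bank transaction to a candidate statement line; reconciliation then matches these against open invoices and payments.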

CRM on Steroids
  • Lead, Opportunity, Account Management
  • VOIP support with One Click calls
  • Schedule Calls
  • Call center support with Calling Queue
  • Mass Email Marketing
  • Tracking ROI on Email Campaigns
  • Analytics
  • Support for Themes
  • Live Chat

Exclusive Retail and Restaurant Point of Sale
  • Support for managing a Restaurant
  • Loyalty Management
  • Support for Kitchen and Receipt printer
  • Hardware Support for Touch Screen Monitor, Cash Box, Bar code scanners, Customer Display and credit card reader.

Other Business Apps
  • Human Resources and Payroll
  • Project Management
  • Manufacturing Resource Planning
  • Slides with support for Docs and Videos
  • Subscription Management
  • Contract Management
  • Warehouse Management
  • Inventory Management
