Retailer Big Data Analytics Intern - Retail Solutions, Inc.        
Northbrook, IL - Customer Success - Northbrook, IL - Internship
POSITION SUMMARY:

This Analytics Intern will support one of the nation's largest drug chains' customized big data solutions and help drive RSi's analytical reporting and business intelligence application adoption and deliver…
          When Digital Becomes Human        

There is an urgent need for an extreme transformation of the customer relationship. Customers live in a world of self-service, big data, customer automation and the integration of the online and offline worlds. If your organization fails to build this digital relationship, your future becomes very uncertain. But succeeding in the digital transformation will not be enough: as a consequence of the digital evolution, there is also a need for a human transformation of your customer relationship. Thinking about the role of humans versus machines, about the role of the warm human touch, and about the power of connecting people with people are the key challenges in this domain. 'When Digital Becomes Human' is a story about combining the digital and the human transformation in your customer strategy. This story will take you on a journey to the future. It is provocative, exciting and scary. Enjoy this amazing view on the future of marketing!
          How Energy and Utility Companies Can Make the Most of Big Data         
By Andy Dearing

E&U firms need to shift away from proprietary systems to more cloud-based and open-source tools to truly take advantage of a flood of data. 


          Preserving Transactional Data        

This paper is an adaptation of a longer report commissioned by the UK Data Service. The longer report contributes to on-going support for the Big Data Network – a programme funded by the Economic and Social Research Council (ESRC). The longer report can be found at doi:10.7207/twr16-02.

This paper discusses requirements for preserving transactional data and the accompanying challenges facing the companies and institutions who aim to re-use these data for analysis or research. It presents a range of use cases – examples of transactional data – in order to describe the characteristics and difficulties of these ‘big’ data for long-term access. Based on the overarching trends discerned in these use cases, the paper will define the challenges facing the preservation of these data early in the curation lifecycle. It will point to potential solutions within current legal and ethical frameworks, but will focus on positioning the problem of re-using these data from a preservation perspective.

In some contexts, these data could be fiscal in nature, deriving from business ‘transactions’. This paper, however, considers transactional data more broadly, addressing any data generated through interactions with a database system. Administrative data, for instance, is one important form of transactional data collected primarily for operational purposes, not for research. Examples of administrative data include information collected by government departments and other organisations when delivering a service (e.g. tax, health, or education) and can entail significant legal and ethical challenges for re-use. Transactional data, whether created by interactions between government database systems and citizens or by automatic sensors or machines, hold potential for future developments in academic research and consumer analytics. Re-use of reliable transactional data in research has the power to improve services and investments by organisations in many different sectors. Ultimately, however, these data will only lead to new discoveries and insights if they are effectively curated and preserved to ensure appropriate reproducibility. This paper explores challenges to this undertaking and approaches to ensuring long-term access.
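As a concrete (and purely hypothetical) sketch of the kind of record meant here, consider a single row emitted as a side effect of a citizen's interaction with an operational database system; the schema below is invented for illustration, in Python:

    # Hypothetical transactional record: one row generated by a service
    # interaction and collected for operations, not for research.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass(frozen=True)
    class TransactionRecord:
        event_id: str        # identifier assigned by the operational system
        timestamp: datetime  # when the interaction occurred
        service: str         # e.g. "tax", "health", "education"
        actor_pseudonym: str # citizen identifier, pseudonymized before re-use
        action: str          # what the interaction did

    record = TransactionRecord("evt-000001", datetime(2016, 3, 1, 9, 30),
                               "health", "p-48121", "appointment_booked")
    print(record)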

 


          Obama’s review of the NSA        
President Obama delivered a speech at the Department of Justice to announce the outcomes of a broad-ranging and unprecedented review of U.S. intelligence programs. [Also read 1) The Fight Against Big, Bad Data 2) Big Data and the Future of Privacy] The review examined how, in light of new and changing technologies, we can use […]
          Transport services driven by data        
"Information has dramatically increased its role in society in recent years. Big data, mass data, very large data, raw data, open data, data analytics, digitisation...," Trafi's Information Director and Director General for Data Resources, Mia Nykopp, lists the various types of information that affect our lives. Trafi is actively involved in the 10th ITS European Congress in Helsinki, held on 16–19 June, 2014.
          Biology: Assistant Professor of Biology - University of Richmond - Richmond, VA        
We seek a biologist who has expertise in analysis of big data, modeling, bioinformatics, genomics/transcriptomics, biostatistics, or other quantitative and/or...
From University of Richmond - Thu, 06 Jul 2017 23:17:18 GMT - View all Richmond, VA jobs
          Design and ethics in the era of Big Data        
Elizabeth Goodman. May-June 2014. Design and ethics in the era of big data. interactions 21, 3 (May 2014), 22-24.
          The Three Mega Trends in Cloud and IoT        
A consequence of the Moore Nielsen prediction is the phenomenon known as Data Gravity: big data is hard to move around, so it is much easier for the smaller applications to come to it. Consider this: it took mankind over 2,000 years to produce 2 Exabytes (2×10^18 bytes) of data until 2012; now we produce this much in […]
          My Top 7 Predictions for Open Source in 2014        
My 2014 predictions are finally complete.  If Open Source equals collaboration or credibility, 2013 has been nothing short of spectacular.  As an eternal optimist, I believe 2014 will be even better: Big data’s biggest play will be in meatspace, not cyberspace.  There is just so much data we produce and give away, great opportunity for […]
          Bank Systems Administrator (M/F) - Alten - IDF        
Join the engineers of the Banking, Finance and Insurance Business Unit. Your challenge? Addressing the sector's major issues: developing mobile customer services (online banking, websites and mobile applications), optimizing the customer relationship (CRM, Big Data), reducing the time-to-market of new financial products, securing electronic transactions... Project description: You will be attached to a team of infrastructure engineers...
          What I said at Hello Culture 2014        
I went to the morning session of Hello Culture, a one-day conference discussing ‘big data’ in the context of arts and culture. I was on a panel called ‘Data – Is the Tail Wagging the Dog?’ I was given a few minutes to talk to the theme and so I put some slides together and […]
          Predicting Natural Disasters with Big Data        

Imagine living in the shadow of an active volcano. That's the reality for thousands of residents near Mexico City, as Popocatepetl looms over their city. Scientists from the USGS, UNAM and CENAPRED are using Big Data to monitor and analyze input from hundreds of live sensors in an effort to keep people safe and ready to evacuate in the event of an eruption.

Cast: Dell Multimedia

Tags: EMC


          Port of the Future        

Abu Dhabi's new state-of-the-art port - on the leading edge of the digital revolution.

Cast: Dell Multimedia

Tags: EMC, Abu Dhabi, UAE, Big Data, technology and shipping


          Marketing Analytics Lab - Grand Opening        

Take a behind the scenes look at EMC's new Silicon Valley Marketing Science Lab, and see how Big Data analytics can help marketers make better connections with customers.

Cast: Dell Multimedia

Tags: EMC Corp, Marketing Analytics, Big Data, Analytics Lab, Michael Foley, Big Data Scientist, Big Data Analytics, Silicon Valley and Marketing Lab


          Preserving Astronomy's Lost Legacy (2 min PARI video)        

Located in North Carolina's Pisgah National Forest, PARI is a non-profit center for astronomical research and education using the power of Big Data to show us the universe as we've never seen it before. PARI is on a mission to digitize its two hundred thousand donated star plates into a massive online database - so they can be accessed and analyzed by researchers around the world.

Cast: Dell Multimedia

Tags: PARI, EMC Corp, Astronomy, Big Data and stars


          Rio Big Data        

Cast: Dell Multimedia


          Preserving Astronomy's Lost Legacy        

"Like" us on Facebook: on.fb.me/N2vD24
Follow us on Twitter: http://bit.ly/Ukp7aB

How large is the universe? How did the universe begin? Could there be intelligent life out there? The answers to these questions are right in front of us... and they're being stored on EMC-donated storage.

Astronomers have been recording the night sky since the mid-1800s on photographic glass plates known as star plates.
These fragile plates have been hidden away in basement archives for generations... that is, until now.

Located in North Carolina's Pisgah National Forest, PARI is a non-profit center for astronomical research and education using the power of Big Data to show us the universe as we've never seen it before. PARI is on a mission to digitize these two hundred thousand donated star plates into a massive online database - so they can be accessed and analyzed by researchers around the world.

Cast: Dell Multimedia

Tags: nasa, space, exploration, seti and emc


          The Human Face of Potential        

Cast: Dell Multimedia

Tags: big data, emc, diabetes, analytics, sony fs100, nikon d800, d800, desert, utah, nevada, running and sport


          Trust: The Power of Transformation (preview)        

How can Trust in action help impact the lives of millions? EMC's John Custer, a former U.S. Major General, revisits the Horn of Africa to explore how the power of Big Data is transforming the future for generations to come.

Stay tuned for the full documentary from EMC TV.

More from EMC TV emc.im/wWVBwx

Cast: Dell Multimedia

Tags: EMC, EMC TV, Big Data, Djibouti and Africa


          Big Data Market: Embracing Data to Transform Healthcare and Pharma Commercial Strategy - Featuring Expert Panel Views from Industry Survey 2016        

"Big Data: Embracing Data to Transform Healthcare and Pharma Commercial Strategy - Featuring Expert Panel Views from Industry Survey 2016" provides a comprehensive analysis of the Big Data landscape. GBI Research conducted an extensive industry survey of 73 experts from the pharmaceutical and healthcare industries.

Pune, Maharashtra -- (SBWIRE) -- 02/09/2017 -- "Big Data: Embracing Data to Transform Healthcare and Pharma Commercial Strategy - Featuring Expert Panel Views from Industry Survey 2016" provides a comprehensive analysis of the Big Data landscape. GBI Research conducted an extensive industry survey of 73 experts from the pharmaceutical and healthcare industries - including both organizations that already utilize Big Data and those that do not. The survey gathered experience and opinion on the use of Big Data, and insights on key trends for the present and future use of the technology within healthcare.

Big Data refers to any data set that is too large to store, process or analyze using traditional database software and hardware. It can have a significant impact on all aspects of the pharmaceutical and healthcare sector, and companies are making large investments to leverage the technology more effectively.

Browse more detailed information about the Big Data report:

https://www.absolutereports.com/big-data-embracing-data-to-transform-healthcare-and-pharma-commercial-strategy-featuring-expert-panel-views-from-industry-survey-2016-10529057

The report features an overview of Big Data and its place within healthcare. It examines the factors driving and necessitating the use of the technology within this industry, and provides detailed examples of how different Big Data sources and analytics techniques could be used to provide direct benefits to pharmaceutical companies, healthcare institutions and patients.

Report Scope:

- What is Big Data? What is its place within healthcare, and what are the main data sources?

- How prevalent is the use of Big Data in healthcare?

- What are the main driving factors necessitating the use of Big Data in healthcare? What is the relative importance of these factors according to industry?

- What are examples of the commercial benefits that the use of Big Data and analytics can provide, in different aspects of the industry?

- What are the main challenges associated with Big Data in healthcare? What is the relative importance of these factors according to industry? For the organizations that do not yet utilize Big Data, what specific reasons have led to their decision not to do so?

- How do major pharmaceutical and healthcare companies use Big Data in the real world? What are some of the main partnerships between Big Pharma and technology companies? What is the underlying technical architecture of Big Data in healthcare?

- What is the likelihood that organizations that already use Big Data will increase their investment within the next five years? Will those that do not currently invest in the technology begin doing so in the next five years?

- How can Big Data be effectively implemented within an organization?

Get a PDF Sample of the report:

http://www.absolutereports.com/enquiry/request-sample/10529057

Reasons to Purchase:

The report will allow clients to understand the market opportunities, competitive landscape and forecasts for Big Data in the healthcare industry. Interested clients will get a view of how the technology is developing and of all the key factors that play together to affect or improve its adoption.

Have any query? Ask our expert @ http://www.absolutereports.com/enquiry/pre-order-enquiry/10529057

Detailed TOC of Big Data - Assessing the Need for a Targeted and Specialized Approach

1 Big Data Overview 9
- What is Big Data? 9
- The 'Three Vs' of Big Data: Volume, Velocity and Variety 9
- The Sources of Big Data in Healthcare 10
- The Big Data Lifecycle 12
- How Prevalent is the Use of Big Data in Healthcare? Results from our Industry-Wide Survey 13

2 Drivers of Big Data in Healthcare 17
- Advances in Technology: Explosion in Data Generation 17
- Next-Generation Sequencing Technologies: Outpacing Moore's Law 17
- Proteomic Databases: ProteomicsDB Designed with Big Data Analytics in Mind 18
- Electronic Health Records: A Form of Big Data 19
- Social Media: Information That Cannot Be Found Anywhere Else 19
- Devices: Smartphones, Wearables and Telemedicine Devices Represent a Continuous Source of Big Data 20
- Cloud Technologies: Often Integral to Big Data 20
- Needs and Trends Driving the Use of Big Data in Healthcare 21

3 Commercial Implications of Big Data in Healthcare 27
- Predictive Modeling: A Fundamental Source of Big Data's Power 27
- Using Big Data for Patient-Specific Modeling: Potential for Huge Healthcare Savings 28
- Big Data Unlocks the Potential of Personalized Medicine and Targeted Therapies 28
- Utilizing the Unique Big Data Provided by Wearables and Fitness Trackers 29
- Big Data for a More Systemic Approach to Drug Repositioning 29
- Drug Discovery and Pre-Clinical Trials: Big-Data-Guided Drug Development 29

4 Appendix 63
- GBI Industry Survey: Breakdown of Respondents by General Industry 63
- GBI Industry Survey: Breakdown of Respondents by Specific Sector 63
- GBI Industry Survey: Breakdown of Respondents by Region 63
- GBI Industry Survey: Proportion of Healthcare Organizations that Currently Utilize Big Data 64
- GBI Industry Survey: Big Data Utilization in Healthcare, Comparison of Expert Panels from Europe, North America and Asia 64
- GBI Industry Survey: Most Important Factors Promoting the Use of Big Data in Healthcare 65
- GBI Industry Survey: Most Important Factors Promoting Big Data, Pharmaceutical Expert Panel vs Overall Healthcare Expert Panel 65
- GBI Industry Survey: Most Important Factors Promoting Big Data, Regional Breakdown 66
And continues...

Get a Discount on the Big Data report:
http://www.absolutereports.com/enquiry/request-discount/10529057

About Absolute Reports
Absolute Reports is an upscale platform to help key personnel in the business world in strategizing and taking visionary decisions based on facts and figures derived from in-depth market research. We are one of the top report resellers in the market, dedicated to bringing you an ingenious concoction of data parameters.

For more information on this press release visit: http://www.sbwire.com/press-releases/big-data-market-embracing-data-to-transform-healthcare-and-pharma-commercial-strategy-featuring-expert-panel-views-from-industry-survey-2016-769494.htm

Media Relations Contact

Ameya Pingaley
Absolute Reports
Telephone: 408-520-9750
Email: Click to Email Ameya Pingaley
Web: https://www.absolutereports.com/big-data-embracing-data-to-transform-healthcare-and-pharma-commercial-strategy-featuring-expert-panel-views-from-industry-survey-2016-10529057


          Global Big Data Infrastructure Market Growth, Drivers, Trends, Demand, Share, Opportunities and Analysis to 2020        

Global Big Data Infrastructure Market 2016-2020 has been prepared based on an in-depth market analysis with inputs from industry experts. The report covers the market landscape and its growth prospects over the coming years. The report also includes a discussion of the key vendors operating in this market.

Pune, Maharashtra -- (SBWIRE) -- 02/09/2017 -- The Global Big Data Infrastructure Market Research Report covers the present scenario and the growth prospects of the global big data infrastructure industry for 2017-2021. The report has been prepared based on an in-depth market analysis with inputs from industry experts. It covers the market landscape and its growth prospects over the coming years, and includes a discussion of the key vendors operating in this market.

Big data refers to a wide range of hardware, software, and services required for processing and analyzing enterprise data that is too large for traditional data processing tools to manage. In this report, we have included big data infrastructure, which includes mainly hardware and embedded software. These data are generated from various sources such as mobile devices, digital repositories, and enterprise applications, and their size ranges from terabytes to exabytes. Big data solutions have a wide range of applications such as analysis of conversations in social networking websites, fraud management in the financial services sector, and disease diagnosis in the healthcare sector.

Report analysts forecast the global big data infrastructure market to grow at a CAGR of 33.15% during the period 2017-2021.
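For context, a 33.15% CAGR compounds as follows; in this minimal Python sketch the 2017 base value of 100 is a hypothetical index, since the excerpt does not state an absolute market size:

    # Project a hypothetical 2017 index of 100 forward at a 33.15% CAGR.
    def project(base, cagr, years):
        return [base * (1 + cagr) ** y for y in range(years + 1)]

    for year, size in zip(range(2017, 2022), project(100.0, 0.3315, 4)):
        print(f"{year}: {size:,.1f}")  # the index roughly triples to ~314 by 2021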

Browse more detail information about Global Big Data Infrastructure Report at: https://www.absolutereports.com/global-big-data-infrastructure-market-2016-2020-10337627  

The Global Big Data Infrastructure Market Report is a meticulous investigation of the current scenario of the global market, covering several market dynamics. The report is a resource that provides current as well as upcoming technical and financial details of the industry to 2021.

To calculate the market size, the report considers the revenue generated from sales of big data infrastructure globally.

Key Vendors of Global Big Data Infrastructure Market:
- Dell
- IBM
- HP
- Fusion-io
- NetApp
- Cisco

 

Other prominent vendors
- Intel
- Oracle
- Teradata

And many more……

 

Get a PDF Sample of Global Big Data Infrastructure Research Report at: http://www.absolutereports.com/enquiry/request-sample/10337627  

The Global Big Data Infrastructure market report provides key statistics on the market status of big data infrastructure vendors and is a valuable source of guidance and direction for companies and individuals interested in the industry.

Global Big Data Infrastructure Driver:
- Benefits associated with big data
- For a full, detailed list, view our report

Global Big Data Infrastructure Challenge:
- Complexity in transformation of procured data to useful data
- For a full, detailed list, view our report

Global Big Data Infrastructure Trend:
- Increasing presence of open source big data technology platforms
- For a full, detailed list, view our report

Purchase report @ http://www.absolutereports.com/purchase/10337627  

 

Geographical Segmentation of Global Big Data Infrastructure Market:
· Global Big Data Infrastructure in Americas
· Global Big Data Infrastructure in APAC
· Global Big Data Infrastructure in EMEA

 

The Global Big Data Infrastructure report also presents the vendor landscape and a corresponding detailed analysis of the major vendors operating in the market. Global Big Data Infrastructure report analyses the market potential for each geographical region based on the growth rate, macroeconomic parameters, consumer buying patterns, and market demand and supply scenarios.

Have any query? ask our expert @ http://www.absolutereports.com/enquiry/pre-order-enquiry/10337627

Key questions answered in the Global Big Data Infrastructure market report:
- What are the key trends in the Global Big Data Infrastructure market?
- What are the growth restraints of this market?
- What will the market size and growth rate be in 2020?
- Who are the key manufacturers in this market space?
- What are the market opportunities, market risks and market overview for Global Big Data Infrastructure?
- How will the revenue of this market develop over the coming years?

Get Discount on Global Big Data Infrastructure Research Report at: http://www.absolutereports.com/enquiry/request-discount/10337627

The report then estimates the 2017-2021 market development trends of the Global Big Data Infrastructure market. Analysis of upstream raw materials, downstream demand, and current market dynamics is also carried out. Finally, the report makes some important proposals for a new Global Big Data Infrastructure project and evaluates its feasibility.

And continues...

About Absolute Reports:

Absolute Reports is an upscale platform to help key personnel in the business world in strategizing and taking visionary decisions based on facts and figures derived from in-depth market research. We are one of the top report resellers in the market, dedicated to bringing you an ingenious concoction of data parameters.

For more information on this press release visit: http://www.sbwire.com/press-releases/global-big-data-infrastructure-market-growth-drivers-trends-demand-share-opportunities-and-analysis-to-2020-769211.htm

Media Relations Contact

Ameya Pingaley
Absolute Reports
Telephone: +14085209750
Email: Click to Email Ameya Pingaley
Web: https://www.absolutereports.com/global-big-data-infrastructure-market-2016-2020-10337627


          Move Big Data To The Public Cloud With An Insight PaaS        
In five years you’ll be using Insight PaaS for big data in the public cloud. On-premise won’t be an option. Here is why. Cloud Is The Hottest Market For Big Data Technology The shift to the cloud for big data is on. In fact, global spending on big data solutions via cloud subscriptions will grow […]
          DigitalGlobe’s Tony Frazier: Govt Leaders Recognize Need to Leverage Machine Learning, Big Data Analytics        
Tony Frazier, senior vice president of government solutions at DigitalGlobe, has said the U.S. government acknowledges the need to leverage big data analytics, machine learning, automation and other commercial technology platforms in order to help transform the intelligence community and global mapping efforts. Frazier wrote in a blog post published Friday that government leaders such as Robert Cardillo, director […]
          MDA to Provide RADARSAT-2 Satellite Data via DigitalGlobe’s Geospatial Big Data Platform        
DigitalGlobe and MacDonald, Dettwiler and Associates have entered an agreement to make data from MDA’s RADARSAT-2 satellite available via DigitalGlobe’s GBDX geospatial big data platform. The partnership intends to make new GBDX uses possible through the integration of optical and radar satellite data, DigitalGlobe said Monday. The RADARSAT-2 satellite works to collect synthetic aperture radar data to help users see Earth […]
          Three Keys to Advancing your Digital Transformation        

Digital assets

With today’s proliferation of data, digital transformation (DX) has become more than a hot topic: It’s an imperative for businesses of all shapes and sizes. The collision of data, analytics and technology has businesses, analysts and consumers excited — and scared — about what could happen next.

On one hand, everyone from banks to bagel shops and travel sites to tractor manufacturers has found new ways to connect the dots in their businesses while forging stronger, more dynamic customer engagement. Artificial intelligence (AI) has come of age in technologies such as smart sensors, robotic arms, and devices that can turn lights and heat on and off, adjust for changes in conditions and preferences, and even automatically reorder food and supplies for us.

However, today's Chief Analytics Officer (and Chief Data Officer and Chief Digital Officer, for example) faces both the promise and precariousness of digitizing business. While significant opportunities abound to drive revenues and customer connectivity, any leader will freely confess there are myriad technological, business and human obstacles to transforming even one element of business, introducing a new unique product or even meeting regulatory requirements.

The Big Data Dilemma

Big Data is at once the promise of the DX and its biggest roadblock. A recent Harvard Business Review article put it succinctly: “Businesses today are constantly generating enormous amounts of data, but that doesn’t always translate to actionable information.”

When 150 data scientists were asked if they had built a machine learning model, roughly one-third raised their hands. How many had deployed and/or used this model to generate value, and evaluated it? Not a single one.

This doesn’t invalidate the role of Big Data in achieving DX. To the contrary: The key to leveraging Big Data is understanding what its role is in solving your business problems, and then building strategies to make that happen — understanding, of course, that there will be missteps and possibly complete meltdowns along the way.

In fact, Big Data is just one component of DX that you need to think about. Your technology infrastructure and investments (including packaged applications, databases, and analytic and BI tools) need to similarly be rationalized and ultimately monetized, to deliver the true value they can bring to DX.

Odds are many components will either be retired or repurposed, and you’ll likely come to the same conclusion as everyone else that your business users are going to be key players in how DX technology solutions get built and used. That means your technology and analytic tools need to allow you the agility and flexibility to prototype and deploy quickly; evolve at the speed of business; and empower people across functions and lines of business to collaborate more than they’ve ever done before.

Beyond mapping out your overarching data, technology and analytic strategies, there are several areas to consider on your DX journey. Over the next three posts, I’ll focus on how to:

  1. Visualize your digital business, not your competitors’
  2. Unleash the knowledge hidden within your most critical assets
  3. Embrace the role and evolution of analytics within your journey

To whet your appetite, check out this short video on the role of AI in making DX-powered decisions.

 

The post Three Keys to Advancing your Digital Transformation appeared first on FICO.


          The Dao of Big Data in Human Resources: The Hegemon's Way, the King's Way, the Emperor's Way        

"Dao" was first put forward by Laozi: "The Dao that can be spoken is not the eternal Dao; the name that can be named is not the eternal name." Its meaning is to follow the laws of nature. The Dao of big data, as the name suggests, is to find the patterns hidden inside big data.


At present, as the era of big data develops in depth, people's understanding of big data is gradually deepening. Big data mainly comprises the following aspects:

  First, data collection;

  Second, data storage;

  Third, parallel computation over the data;

  Fourth, big data analysis and mining;

  Fifth, big data presentation;

  Sixth, big data privacy protection and legal issues.

Not long ago, at the 2017 CIO Conference held in Xining, Qinghai, Gong Caichun, founder of Zhipinhui (and formerly chief scientist of Dajie.com), said in his talk: "In big data analysis and mining, there has never been a universal model that can surface the value of our data in every scenario. No such big data product exists today, and I believe none will appear for a very long time. In other words, it is impossible to turn big data analysis and mining into a general-purpose product. But is there something common to all big data analysis and mining? That common element is what we call the 'Dao of big data'." To explain what the Dao of big data really means, Dr. Gong invoked the king's way (wangdao), the emperor's way (didao) and the hegemon's way (badao), tracing a line from the Yellow Emperor to Yao and Shun, then to Shang Yang, and on to today's rule by law, every step reflecting the historical shift and evolution from the king's way to the hegemon's way.

In the world of big data, what are the king's way, the emperor's way and the hegemon's way? As Dr. Gong put it: "The hegemon's way of a big data company is numbers: whenever you can reduce a situation to a number, you can solve your problem quickly and cleanly. The king's way of big data is data, so you must accumulate data, analyze data and mine data. And for a company to develop sustainably, it must follow the emperor's way of big data, which is mathematics; only when a problem has been solved mathematically has it been solved at the root. So to summarize: the hegemon's way of big data is numbers, the king's way of big data is data, and the emperor's way of big data is mathematics."

Numbers: the hegemon's way of big data

In the era of big data, once an enterprise can reduce the problems it faces to a number, it can easily pin down and solve them. Consider the well-known Academy Awards and the Hurun Rich List. The Oscars take the world's films and pick the best across 24 categories: good or bad, everything ultimately comes down to a number. The Hurun Rich List does the same, describing how much money people around the world have as a single number and ranking them by it.

Today's Zhipinhui works the same way: it expresses a person's performance in the workplace, their degree of excellence and their trustworthiness as a single number, turning the question of personal credit into a simple figure. No matter how broad someone's network, how strong their abilities or how great their achievements, it is all best reduced to one number. When you have turned every problem into a number and computed it, you will find you have completed a crucial part of applying big data. In the workplace especially, where all sorts of figures must be weighed, an HR manager sizing up a candidate only needs to look at one number per dimension. If, in running a big data project, you ultimately boil it down to a few numbers, you are already halfway to success. That is the hegemon's way of big data.

Data: the king's way of big data

In the era of big data, people need to change some of their concepts and ways of thinking. Two mindsets are required. The first is whole-population thinking: the small-data era relied on sampling, but in the big data era sampling is obsolete; instead of sampling, you collect all of the data at collection time, and you must not clean it away.

The second is fault-tolerant thinking. In the small-data world this corresponds to data cleaning: data that may be inaccurate, imprecise or outright wrong is to be identified and removed. That practice carries over into the big data era, but with a change of approach: while clearing out erroneous data, you should look for the causes and reasons behind the errors, because erroneous data also exists for a reason; data that is wrong in one scenario may be perfectly normal in another. In the big data era, two different scenarios call for completely different data, so no data needs to be discarded at all. That is what is meant by fault-tolerant thinking.

Mathematics: the emperor's way of big data

In the era of big data, an enterprise or individual expresses the problems it faces as mathematical models and solves them through those models: that is the emperor's way of big data. Zhipinhui uses a series of mathematical models to judge whether a résumé is genuine and to compute a personal credit score. Taking the ranking of Chinese universities as an example of model building, Dr. Gong explained: "If I want to prove to someone that I hold a doctorate from the Institute of Computing Technology of the Chinese Academy of Sciences, the simplest way is to show people my dissertation, and they will know I really am a doctor from the Institute; that is self-submitted material, and there are all sorts of other ways to verify it. Here, however, we compute a person's score mathematically. How is this person's career score of 905 points arrived at? It has to be calculated, so there must be a computational model. Take my own history: I did my master's at Shandong University. Why go from Shandong University to the Chinese Academy of Sciences? Presumably because I felt the Academy was a bit better than Shandong University. Computed across many people, this forms a directed graph. We hold 150 million résumés, and more than 2,000 of them record a move from one school to another, while China has only about 3,000 universities, so the graph lends itself well to analysis and mining. Once we have this directed graph we analyze and mine it; the analysis algorithm can follow Google's. A series of computations then tells you which universities in China are the best, and that is how the university ranking is produced."
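Dr. Gong says the analysis "can follow Google's algorithm". The following minimal PageRank-style sketch in Python iterates over a school-to-school transition graph; the school names and edges are invented for illustration and are not taken from Zhipinhui's data:

    # Simplified PageRank over résumé moves: each move from school A to
    # school B adds an edge A -> B, and the stationary score ranks schools.
    # (Dangling-node mass is ignored in this sketch.)
    def pagerank(edges, damping=0.85, iterations=50):
        nodes = {n for edge in edges for n in edge}
        out_degree = {n: 0 for n in nodes}
        for src, _ in edges:
            out_degree[src] += 1
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
            for src, dst in edges:
                new_rank[dst] += damping * rank[src] / out_degree[src]
            rank = new_rank
        return rank

    # Hypothetical moves mined from résumés: bachelor's school -> graduate school.
    moves = [("Shandong University", "CAS Institute of Computing Technology"),
             ("School X", "CAS Institute of Computing Technology"),
             ("School X", "Shandong University")]
    print(sorted(pagerank(moves).items(), key=lambda kv: -kv[1]))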

Ranking Chinese companies works the same way. China has 80 million companies in total; which is the best? A mathematical model can compute it: China has 920 million workers whose credit scores can be computed, which yields an iterative computational model, since in general excellent workers join excellent companies. Moreover, an individual's credit standing also depends on who their friends are; that is, a person's credit roughly equals the average of their friends' credit. From this one builds a mathematical analysis model, and iterating the computation reveals which companies in China are the best.

Conclusion:

In short, whether it is the king's way, the hegemon's way or the emperor's way of big data, it all ultimately comes down to collecting, mining, analyzing and organizing data. The essence of big data lies not in the data itself but in analyzing and applying it effectively, finding within the huge mass of data the data that is useful to you: the true meaning of big data is "right data". Gather scattered numbers into data, then build mathematical models on the organized data for analysis and application; that is how big data moves from the hegemon's way through the king's way to finally realize the emperor's way. As the era of big data develops, big data is bound to become an indispensable tool for enterprise development, and how to realize the emperor's way of big data is the question enterprises must ponder.



          Zeebe brings open source order to microservice orchestration        

Looking for an open source option designed for horizontal scalability and fault-tolerance to manage your microservices? Camunda has just launched Zeebe, a big data system orchestrator, to help you keep track of everything and anything.

The post Zeebe brings open source order to microservice orchestration appeared first on JAXenter.


          Data, Communicated        

Data is everywhere. In everything we do, in everything we see, data can be found in any place in the world that surrounds us. It is this concept that has led to the rise of new “Big Data” initiatives to try to harness all of this information. The issue with data is that if you gather too much, the signal you are trying to evaluate can get lost behind all of the noise. More important, perhaps, is that all of this data that is collected amounts to nothing unless actual steps are taken to communicate it to others and to put it to use. At Nyaya, we have been working to get to that point of data communication.

Much of my work here this summer has focused on a new Data Communication Initiative. Through all of our programs we collect boatloads of data; however, it is often difficult to use, and it is rarely ever communicated back to the staff who collect it. With the recent work that we have been doing, this is all changing. Under a new plan we have been putting into place, new data will be hung up every month in the hospital's new Conference and Training center, as seen in the photo below.
Bulletin Board in New Conference Room
We presented the data at a data meeting, which will now become a monthly event at the hospital. Since the bulletin board was set up, dozens of staff members have taken time out of their busy schedules to come and check out the information, helping to inform their actions moving forward.

Nyaya's community health workers also collect a great deal of data from the communities they work in. As shown in the photo below, for the first time, Ashma, the Associate Director of Community Health, was able to show visualized data back to the community health workers, allowing them to finally see the fruits of their labor.

Data Being Communicated to CHWs
Alongside these two projects, other plans are in place to provide weekly data to community health workers, to provide the clinical staff with data to supplement their daily lectures, and to use the help of our GlobeMed chapter to write actionable reports on different data points.


Data may seem like an abstract concept, but it is real, usable information that can improve the care we provide to our patients and strengthen our public health program. All that needs to be done is to take the time to tap into its potential. We are on the road to doing just that.

          Transforming the Cleaning Business with Microsoft Technology        
ISS Netherlands wanted to improve its competitiveness in a crowded facilities services market. With the help of Microsoft technology including Windows Azure and Microsoft Office 365, the company has built a solution that automatically notifies workers when facilities need to be maintained, improving client service while reducing customer costs. We recently spoke with Martijn Jansen, Business Technology Manager at ISS Netherlands, to learn how this solution is transforming the facilities services business.

Q: Please tell us about ISS Netherlands.

Martijn Jansen: ISS Netherlands is part of ISS, which is a leading global provider of facility services. In the Netherlands, we offer services to hospitals, factories, government offices, and companies of all sizes. ISS Netherlands has 12,000 employees who work with a broad spectrum of clients ranging from small Dutch companies to large global enterprises.

Q: What challenges were you facing that led you to build a solution using Windows Azure and Office 365?

Jansen: Facilities services is a very competitive market, especially in the area of pricing. In the past few years, the focus has primarily been on pricing, since it's been hard to differentiate oneself in other areas. With Windows Azure and Office 365, we can easily scale up and down. This was important to us given the changing nature of our business. In addition, Windows Azure and Office 365 enabled us to innovate without a lot of risk to our company. ISS Netherlands has a relatively small IT department. We didn't want to make a large investment that would require us to purchase and maintain our own hardware and software. Instead, we wanted to build a cloud-based solution that was both adaptable and flexible. Windows Azure and Office 365 met all of these needs.

Q: What solution did you build?

Jansen: In a traditional cleaning environment, for example an office building, the standard proposition is that you clean each rest room twice a day, five days a week, 52 weeks a year, whether it's needed or not. So even if no one is there, you'd still clean the toilets twice a day. We wanted to improve our efficiency by only cleaning where it was needed, and doing so right when it was needed. To do that, we created a simple "on-off" sensor that records every time a toilet, soap dispenser, or towel dispenser has been used, and then sends a message to our SQL Server database, which is run as a virtual machine within the Windows Azure cloud platform. We then created business rules so that after a specified number of visits, Windows Azure sends a message to all of the mobile phones used by the employees responsible for working in the area informing them that a particular toilet or towel dispenser needs to be serviced. Using Microsoft BizTalk Server, we send this information to our Financial Management Information System and to our customer SharePoint portals to keep track of our progress. We are using the same system to measure whether a plant needs water, whether a door has inadvertently been left opened at night, or whether a mouse trap needs to be reset. The minute the threshold is reached, we know what has to be done in a specific room of a given building. So we're no longer doing a standard routine of cleaning—only when it's required.

Q: That seems like a lot of information to track. How is the ability to process big data playing into your solution?

Jansen: The ability to cost-effectively process big data has made the solution possible. Each sensor captures multiple messages on an hourly basis, and we have thousands of sensors set up across all the facilities we manage. Our database holds several terabytes of data, and that quantity is growing exponentially with every customer we add. None of this would have been possible five years ago. The amount of processing power that we would have needed and the amount of data that needs to be stored would have made the investment cost-prohibitive.

Q: What role do Office 365 and SharePoint Online play in the solution?

Jansen: We've developed a customized portal for each customer using SharePoint Online, which they can view using Office Web Apps. The portal shows when we are coming to service the facility. It incorporates a map with all the inspection points, including mouse traps, toilets, towel dispensers, etcetera. It highlights all current issues as well as those that already have been fixed. And it displays the sensor data as easy-to-read bar charts searchable by topic via drop-down menus. This allows clients to get a high-level overview of what's happening, while also reviewing the detailed data. So, for example, a manager can get a summary of pest control issues in his building, and then click through the portal to see the status of a specific room or even a specific trap. We've also built a new feature into the SharePoint portal that allows customers to communicate with us. Initially, we'll be deploying it at one of the hospitals we manage so that our client can inform us when a patient leaves and the bed needs to be made. Most Dutch hospitals have a shortage of beds, so it's paramount that they be turned around quickly. Once the bed is made, our workers can send a message via their mobile phone back to the SharePoint portal. This will enable the hospital to put its bed space to more efficient use.

Q: How has your automated solution improved customer service?

Jansen: It's improved our customer service by enabling us to offer a higher-quality service at a lower price. By using the sensors, we're focusing on the jobs that need to be done at that moment and eliminating unnecessary work. For example, we used to visit one of our pest control contacts every week. But with the sensors, we now know the number of mousetraps going off without actually visiting the site. So we can now simply visit the site when a mouse is caught rather than checking whether or not it's happened. That's reduced our hours and thus our cost to the customer. What's more, customers now feel we are helping them 24/7 rather than just a few days a month. We're also providing our customers with better information. With all the sensory data at our disposal, we can tell our clients what's actually happening rather than using our gut feeling. Customers really like that because they now have precise data that they can present to their managers as well.

Q: What benefits has ISS Netherlands derived from the solution?

Jansen: It's made us more competitive. By using web portals and other technology we're on top of the league. With our automated solution, we've been able to serve more customers at lower cost with the same number of workers. But we're not stopping there. Each day we're examining how we can expand the uses of our solution to further improve our services. Using Microsoft technology, we've created a whole different way of managing facilities—one that's highly efficient and provides top value for our customers.
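As a rough illustration of the business rule Jansen describes (count sensor events per fixture, then push a notification once the visit threshold is crossed), here is a minimal Python sketch; the fixture names, thresholds and notify() stub are hypothetical stand-ins for the rules the production system runs against SQL Server on Windows Azure:

    # Count "on-off" sensor events per fixture; notify the crew at a threshold.
    from collections import defaultdict

    THRESHOLDS = {"toilet": 25, "soap_dispenser": 40, "towel_dispenser": 30}
    visit_counts = defaultdict(int)

    def notify(fixture_id):
        # Stand-in for the message pushed to the cleaners' mobile phones.
        print(f"service needed: {fixture_id}")

    def on_sensor_event(fixture_id, fixture_type):
        visit_counts[fixture_id] += 1
        if visit_counts[fixture_id] >= THRESHOLDS[fixture_type]:
            notify(fixture_id)
            visit_counts[fixture_id] = 0  # serviced; start counting again

    for _ in range(26):  # the 25th visit triggers one notification
        on_sensor_event("building-a/floor-2/toilet-3", "toilet")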
          Singapore Oracle Sessions - Beginnings        

Last Monday evening we had the first Singapore Oracle Sessions - an informal meetup of Oracle professionals thrown together at the last minute by a few of us.

Morten Egan (or as I believe he is called in Denmark now - The Traitor ;-)) mentioned to me months ago that if there was no user group when we arrived in Singapore, then we should start one. At the time he was the current (now retired) chairman of the Danish Oracle User Group (DOUG, strangely enough) and, as I've presented at and supported various Oracle user events over the years and am an ACE Director, it seemed fitting that we should try to build something for the Singapore Oracle community.

The fact that the Oracle ACE Hemant Chitale works for the same company and that the ACE Director Bjoern Rost would be spending a few days at my place before continuing on to the OTN APAC Tour was too much of an opportunity. After a short chat on Twitter we decided to bite the bullet and I started researching venues and contacted some of the locals. We only had 6 days to arrange it so it was either brave or stupid!

As it came together and (through a few very good contacts) we had more and more attendees registering it started to seem like a reality and eventually Bjoern, Madeleine and I found ourselves walking along to the Bugis area on Monday, hoping for the best. Despite some initial problems finding the venue, we arrived to find the extremely helpful Sean Low of Seminar Room who took excellent care of us. 

Within a matter of 15 minutes or so, 33 of the 36 or so who had registered were safely settled in their seats (including my other half Madeleine, who *never* attends Oracle stuff!) for my brief introduction, during which Sean insisted I try out the hand-held microphone.

My big Sinatra moment (not).

First up was Bjoern Rost of Portrix with "Change the way you think about tuning with SQL Plan Management" which, as those who've seen me present on the subject at OpenWorld, BGOUG or UKOUG would know, is a subject dear to my heart. However, Bjoern seems to have had much more success with it than my failed attempts that were damned by literal values and Dynamic SQL. (I've since had a little more success, but mainly as a narrow solution to very specific problems.)

Bjoern and attentive audience

As you can see, the room was pretty full and the audience very attentive (except for a few people who appear to be mucking around with their phones!). They weren't afraid to ask some interesting and challenging questions too, which I always find very encouraging. 

Early in Bjoern's presentation we suffered what I would say was the only significant disappointment of the night as both the drinks and the pizza turned up early! It was nice of the delivery companies not to be late, but my stupid expectation that 7pm meant 7pm ensured that I was standing at the back of the room surrounded by obviously gorgeous pizza that was slowly going cold, not knowing whether I should stop Bjoern in his tracks or not. Manners dictated not (particularly as there were so many people in a small room) but the pizza experience later suggests I was wrong. Lesson learned! (Note that I had to ask others about the pizza as it's on my extensive list of things I don't eat.)

What obviously didn't go wrong at all was the social interaction between all of the attendees and speakers. It probably helped that there were a few attendees from some organisations and that people from different organisations had worked with each other in the past but it's a *long* time since I've felt such a vibrant energy during a break.

Attendees enjoying pizza and conversation

I was up next, presenting on "Real Time SQL Monitoring", and apart from a few hiccups with the clicker I borrowed from Bjoern and a couple of slide corrections I need to make, I think it went reasonably well and people seemed as enthused by SQL Mon reports as I've come to expect! With that done, and a quick smoke (I *love* organising an agenda :-)), it was time for Morten with his "Big Data Primer".

Morten doing his thing

I think this might have been many people's favourite presentation because it wasn't just about Oracle and Morten packed in plenty of the humour I've come to expect from him. Better still, it seemed to work for a quite cosmopolitan audience, so good work!

Afterwards he said a few words asking for people's feedback and whether there was a desire to setup a local user group or just continue with these informal sessions (sponsors permitting) and all of the feedback I heard later showed that people are very keen for a repeat run. 

Overall, Monday night felt like a great success. 

The passion and enthusiasm of the attendees was very encouraging and reflected in the subsequent feedback which has been consistently positive but also thoughtful so far. There's no question that a decent minority of the local Oracle community are looking for regular opportunities to hear decent speakers on subjects that interest them, meet and discuss issues with each other and also offer to present themselves, which is a great start for any Oracle User Group.

Strangely, I discovered a day or so later that there are already plans for a User Group and the Singapore launch event is next Wednesday. Coincidentally this is only 9 days after SOS! You can look into the APOUG website here and a number of colleagues and I will attend the launch event. I suppose it's a small shame that it's an APAC-wide user group, rather than specific to Singapore, which the number of attendees at such short notice would suggest Singapore can justify, but I'll be interested to see what APOUG has planned.

Big thanks to Alvin from Oracle for endless supplies of fine pizza and to Bjoern Rost of Portrix Systems for the room hire (I bought the drinks, which some would say was appropriate, but I couldn't possibly comment), and thanks again to all the attendees for making it a fun night!

I didn't notice until I was about to post this that Bjoern had already blogged about the evening and I think he's captured it perfectly.


          Red Hat Enterprise Linux gets cozy with MongoDB        
Easing the path for organizations to launch big data-styled services, Red Hat has coupled the 10gen MongoDB data store to its new identity management package for the Red Hat Enterprise Linux (RHEL) distribution.
          Will transactional marketing displace existing forms of promotion?        
Digitization, technological development, big data and machine learning make it possible to use available information ever more effectively and to maximize the effectiveness of customer communication. One solution built on high-quality data is so-called transactional marketing, which according to three-quarters of marketers will replace existing forms of promotion, advertising and customer loyalty programmes.
          Summer 2016 tech reading        

Hi there! Summer is here and almost gone. So here's a gigantic list of my favorite, recent articles, which I should've shared sooner.

Java

Other languages

Reactive programming

Persistent data structures

CRDT

Data

Systems and other computer science-y stuff

Fun/General

Until next time! Ashwin.

          Fall 2014 tech reading        
My posts are getting less frequent and when I do post something, I realize that they are mostly just links. Yes, work is keeping me busy.
 
Big data:
Really? Another Hadoop SQL layer? Another Storm?
For those of you who knew about the original "column oriented stores" and "in-memory stream processing" - KDB - http://queue.acm.org/detail.cfm?id=1531242

Java:
Java 8 - the good and ugly bits:
Networks and systems:
The usual Scala and Go hate:
Until next time!
          April tech reading        
Here's a bunch of stuff I found to be of some interest and relevance. Happy reading!
Java:
The old tuples and value objects conversation (again):
An Apache HTTP client "bug"/weirdness I ran into recently, which would end up consuming a large number of ephemeral ports (client side) instead of reusing connections - fix description. The ports would end up waiting in the TIME_WAIT state for a long time and the client would eventually stop, unable to make any new requests.

Big data stuff. Naturally, any list is incomplete without big data: 
IntelliJ 13.1 and Git weirdness:
Random, clever tech stuff:
Until next time!
          This month's good tech reading        
(Many of these links I discovered in my Google+, Twitter, HN or RSS feeds. I don't claim credit for being the first to find them.)

Java:
Do I detect a "NoScala" sentiment here? 
Big data:
Curios:
Network:
Until next time!
          Java/tech stuff I found on the internet (Dec 2013 edition)        
Networking and big data:
Java/JVM perf:
Java memory model + arrays + visibility/ordering:
Curios:
Good ElasticSearch + Logstash videos:
Happy holidays!


          EPLAN Experience: Your Gateway to Efficiency        

Digital Engineering

Dear Desktop Engineering Reader: Are your workflows efficient, knowledge-driven and optimized for today’s competitive engineering challenges like increasing product complexity, Big Data and new technologies? Chances are good that a lot of your workflows and their constituent tasks are really digitized versions of your fumbling first steps toward an end that used to work back ...

The post EPLAN Experience: Your Gateway to Efficiency appeared first on Digital Engineering.


          How Facebook’s big data can transform brands        
Facebook is growing rapidly from a fun space where people share their personal life journeys into a powerful search engine, breaking news source and entertainment platform. With its incredible reach, marketers have naturally gravitated to the platform to reach customers in new ways. Since the first friend request was made on the Harvard campus in […]
          Billy Beane, Baseball, and the Big Data Debate        
The real Billy Beane and Brad Pitt, who played him in 2011's Moneyball

I was the highest-rated amateur player in 1980, alongside Darryl Strawberry, because I looked like a baseball player. They rated me based on all the things that got me elected homecoming king but didn't yield returns on the baseball fields. So began […]
          Collaborative Data Management – Need of the hour!        
Well, the topic may seem like a pretty old concept, yet it is a vital one in the age of Big Data, Mobile BI and the Hadoops! As per the FIMA 2012 benchmark report, Data Quality (DQ) still remains the topmost priority in data management strategy: 'What gets measured improves!' But often a Data Quality (DQ) initiative is... Read More
          What has happened to the Skinny Data Only Combos?        
I purchased an iPad for my grandparents (both 90+) a couple of years ago so they could Skype my mum, who lives in the UK.

Since they don't have a landline internet connection (they don't have a computer), I put a Skinny Data Only SIM card in their iPad and have been topping it up by 5GB ($60) every 3 months, since they only Skype her once a week (occasionally twice) and use about 300MB per call on average (it can vary from 200-500MB).

This has been perfect for them, and despite being very anti-technology they have warmed to their weekly Skype calls with their daughter, whom they haven't seen for 15 years and who is very ill and bedridden and therefore cannot travel to see them.

I log onto Skinny from wherever I am in the world at the time and top up and apply the 5GB data pack to keep them in contact for the next 3 months; they don't need (or want) to know anything else about what a top-up is, or email, or anything else related. They struggled just to work out how to use Skype and even now are scared of it, but they have worked out how to make a call and feel proud of themselves every time they use it.

I have just logged into their account to check on their data usage and noticed that the 5GB Data Combo is no longer available. Can anyone advise what has happened to it? There seems to be no mention of it on their website any longer.

The only option I appear to see now is the Ultimate Combo (2.5GB data + other stuff that would be no use to them on an iPad only - i.e. calls & texts); however, the cost of this over 3 months would be $138, compared to the $60 it has been costing to date. There is another package below this, Big Data, but it only provides 1GB of data per month, where they use on average 1.7GB a month with their weekly calls and the overhead of app updates etc., so this would not suffice... and if I were to tell them they need to reduce their Skype usage, it would still come in more expensive on Big Data than before, at $78 for 3 months.
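For concreteness, the arithmetic above normalized to a 3-month top-up cycle (all figures as quoted in this post), as a quick Python sketch:

    usage_gb = 1.7 * 3  # roughly 5.1 GB needed per 3-month cycle

    plans = {  # name: (GB per 3 months, NZ$ per 3 months)
        "old 5GB data-only combo": (5.0, 60),
        "Ultimate Combo (2.5GB/mo)": (7.5, 138),
        "Big Data (1GB/mo)": (3.0, 78),
    }

    for name, (gb, cost) in plans.items():
        print(f"{name}: {gb} GB for ${cost} (${cost / gb:.2f}/GB; need ~{usage_gb:.1f} GB)")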

They still have data in their account to last another couple of months at most, so I'm a bit stuck on what I am going to do after this.

I would really appreciate hearing people's feedback on what has happened to the Skinny Data Only combos, and any advice on what to do in this scenario.

          Big Data Engineer        

          Big Data Analytics Intern - Bytes Universal Systems - Johannesburg, Gauteng        
Looking for Big Data Rock Stars interested in an Internship at Bytes Johannesburg. Main Purpose of the Job (In one sentence)....
From The Altech Group - Tue, 01 Aug 2017 11:27:47 GMT - View all Johannesburg, Gauteng jobs
          The Effects of Big Data in APM        

Big Data is feeling the heat. Among others, the heat of IoT and the heat of Big Data Application Performance Management. In this post, we’re going to look at how Big Data is revolutionizing how it deals with... Read More

The post The Effects of Big Data in APM appeared first on .


          A useful approach to big data        




John Klein

Big data — or the notion of it — is one of the more significant issues confronting today’s nonprofit leaders.  From large, international enterprises to single entrepreneurs, information about customer transactions, communications connections and purchase preferences exists from a broad array of sources. Some organizations have synthesized their data into meaningful insight that drives or transforms their business. Netflix, for example, gleaned so much value from its subscribers and their habits that it was willing to make an educated roll of the dice on producing original content — and changed the media landscape in the process.

However, Netflix is an exception, perhaps even a rarity: it employed technology, data and the strength of its convictions to leverage big data to its advantage. For nonprofits, technological complexity, the sheer volume of data and the understandable reluctance of leaders to base their futures on this mix stop big data in its tracks. This is not a new occurrence — thousands of research reports and customer behavior overviews gather dust in office bookcases and have done so for decades. The tools behind big data just make the problem bigger.

Perhaps the time has come to apply a more useful and manageable version of big data and to focus these data and technology advances toward specific, fundamental questions nonprofit leaders ask themselves about the viability of their organizations. At face value, this might sound like a typical research project, but the value lies in the approach. The keys are:
  •   Gaining access to a manageable base of customers and prospects
  •   Utilizing analytical tools that reflect today’s level of technological sophistication and ease of use
  •   Asking the right questions in a straightforward manner based on standard research protocols
  •   Assembling the results in an easy-to-understand format, focused on action — in other words, presenting insights that can be employed to refine and grow the organization
    Why do people go to museums?

    This might sound like an ethereal question for the ages, more philosophical than practical. But the arts in the 21st century are facing systemic challenges: competition, funding, even access (through technology). Leaders of the Association of Art Museum Directors (AAMD) addressed this issue during a seminar at the Aspen Institute last spring, where they asserted that “the model of the art museum has never been more challenged and in need of creative re-imagination.” In addition, these challenges have become more vexing since there are no current comprehensive best-practices guidelines for this environment — and they recognize that potential solutions may be based on each museum’s unique market and audience.

    Based on these challenges, I collaborated with an art museum to determine, based on their attitudes and motivations, why patrons and members visit. The research was based on John Falk's The Museum Experience, in which he suggests that visitors fall into five categories according to their motivations: curiosity, social interests, experience-seeking, art as a hobby and emotional recharging.

    To apply Falk’s hypothesis to the museum, we polled through e-mail more 1,300 respondents associated with the museum and the local community. These respondents were asked a series of 15 attitudinal questions about their habits and to rank their agreement on a 1–5 scale. In addition, they were also polled for demographic information and their participation in other performing and visual arts. The data was analyzed with Qualtrics, a leading U.S. online survey research firm.

    The museum found that, beyond the art itself, patrons regarded a visit as a holistic immersion in an overall experience. Among the top responses, patrons felt the museum:
    •   Was valuable because "it is an important part of the community" 
    •   Was an integral part of their "overall interests in the arts" 
    •   Provided "an insider's view of art" and the artist's creative process
    •   Contributed to the social connectedness of friends and family
      Why did this data matter to the museum? First, it validated many of the museum’s activities, including membership, educational initiatives and special events. Second, it gained specific direction to enhance offerings of value to its patrons and prospects. Third, it developed a framework to promote holistically its unique artistic experience, reflecting the demographics and the culture of the community. All these insights were new to the museum and would not have been available without the speed and depth of analysis provided by current data gathering and technology. But most importantly, the museum took the first step to re-visioning its role as an artistic entity and important element of the local community.

      If a focused approach to big data can work for an art museum, it has possibilities in your organization as well.

      John Klein is principal of Trilithon Partners, a marketing consultant firm for nonprofits.

                Big Data Creates Big Differences in Digital Marketing Efforts        

      Each year around this time the analysts at Gartner go to work on a series of widely read reports called Hype Cycles that summarize the maturity and velocity of a staggering number of technologies. Spoiler alert: this year, we will…



                Big Data and Marketing: Overcoming the Bias toward Bias        

      As revolutions go, the Big Data Revolution is a tough one to get really passionate about. Revolutions work best when they rally around simple, inspiring principles: freedom, equality, really handy devices….  But Big Data, which Gartner defines as “high-volume, high-velocity and…



                B2B Marketing Predictions for 2013 by @sookieshuen        

      Change, innovation, progress: while these terms should always be associated with the positive, for marketers entrenched in their current methodologies the future can seem downright scary as the lines blur. The ways in which consumers can now discover, consume and engage with products and services make it a challenge for marketing professionals to keep up. It only makes sense that with dynamic change, some concepts fall by the wayside, while others emerge to further our goals. Here are some pertinent predictions for 2013. Is outbound marketing over? 2012 saw the continued growth of social media and the 'Big Data' […]



                AGM and privacy debate        

      CILIP North East invite you to a debate on user privacy in libraries held at the Mining Institute in Newcastle.

      The motion to be debated is:

      This house believes that protecting users' privacy in libraries should take precedence over any other demands on users' data.

       

      Debate chair: Dr Biddy Casselden
      Biddy is a Senior Lecturer at Northumbria University, in addition to being a programme leader for the MA/MSc Information and Library Management DL course. Prior to becoming an academic, Biddy worked in a variety of information management posts.

      Proposing the motion:

      • Ian Clark
        Ian Clark is a subject librarian at the University of East London and a co-founder of Voices for the Library. Involved in the Radical Librarians Collective, he is active in encouraging awareness of privacy and intellectual issues in libraries and last year wrote a paper on the relationship between surveillance and the digital divide.
      • Alex Haydock
        Alex is a recent law graduate and continuing academic, specialising in technology and internet law. He currently acts as organiser for the Open Rights Group's local North East group, and helps to organise events teaching practical digital security with CryptoParty Newcastle.

      Opposing the motion:

      • Robin Smith, Head of cybersecurity at West Yorkshire Police
      • Peter Dinsdale
        Peter is an experienced Data Protection Officer, currently working in the Information Security Team at Newcastle University. He is studying for an LLM in Information Rights Law and Practice at Northumbria University, with an interest in the legal and privacy implications of the use of big data in learning analytics in higher education.

      This debate is open to anyone interested in libraries and/or privacy.

      CILIP North East members are invited to attend the group's AGM prior to the debate starting (while everyone else gets to admire the beautiful Nicholas Wood Library or sip on their drink a bit longer!)

       

      17:15 Doors (and the upstairs bar) open

      17:45 CILIP North East AGM

      18:00 Start of the debate

      19:30 Expected finish time for the debate

      Thu, 23rd Mar 2017 - 5:15pm to 7:30pm
      Aude Charillon

      The North of England Institute of Mining and Mechanical Engineers

      Westgate Road
      Newcastle Upon Tyne
      Free event

                NFV, SDN, Big Data – It's All About Automation        
      The network must be automated. And Light Reading must write about it.
                "This is more important than just a machine learning contest": the first datathon in Belarus kicks off today at the Imaguru startup hub        

      The Imaguru Datathon is a venue for solving concrete business problems with data science and big data techniques. The partner companies, velcom and Priorbank, have formulated the challenges and provided their datasets.


                Millennium Distinction awarded to M-Files Oy and Aalto University Professor Sebastiaan van Dijken        
      Technology Academy Finland rewards big data experts operating in Finland.
                The Kahala Hotel & Resort - A Success Story        

      With limited time and floods of inbound RFPs to respond to, proactive group prospecting can be hard. However, it's critical that hotels sell proactively to keep up with increased competition resulting from new inventory coming into nearly every US market. Independent hotels often have to work harder to sell their property without the backing of a brand affiliation, but big data can make proactive selling easier, more efficient, and more effective. Kerry Kuhl, the National Sales Manager at The Kahala Hotel & Resort, demonstrates how she successfully generates new group business, using group data and booking patterns to develop time-saving tactics and do a lot of the heavy lifting for her.


                Lewis Baltz. Fundación Mapfre.        
      The Lewis Baltz photography exhibition on view at the Fundación Mapfre until June of this year has been conceived as a major retrospective that lets us enjoy the work of this great photographer, little seen in our country.

      Lewis Baltz (Newport Beach, California, 1945 - Paris, 2014) belongs to that great generation of American photographers who conceived of photography as a tool for making Art, and as a weapon that fires beyond merely aesthetic exercises. They understood photography as a useful instrument for revealing social and political concerns and for taking an ideological stance toward reality.

      Lewis Baltz died in Paris, perhaps on a day of downpour. He had arrived in Europe in the eighties fleeing the Reagan era, changing his register and his palette, far from a United States now shaken under the Trump era. His reflections on new technologies and unbridled industrialization, together with the movements of populations that lead to the destruction of the planet, formed the core of his work.

      The archive of his work is held today at The Getty Research Institute.

      His way of approaching an explanation of the world, and a denunciation of its ills, was to record a bruised landscape, a world turned into a dunghill by industrial archaeology, modifying and constructing a landscape very different from the romantic one flown over by Walter Benjamin's angel. 

      When photography was still understood as a craft useful for recording houses, things, events or people, Baltz was already employing photography as a resource for interpreting reality and working with the landscape wounded by man's passage. No human beings appear in his photographs, only their trace, which constructs the landscape.

      Lewis Baltz is a conceptual artist who employs photography, of impeccable formal resolution, to relate the destruction and desolation of the landscape. 

      His training as a photographer took place between the San Francisco Art Institute and the Claremont Graduate School in California. More than 40 years have passed since a 26-year-old Lewis Baltz began to build the coherence of his work, backed by the Leo Castelli Gallery from his first solo exhibition in 1971, The Tract Houses, and above all through his inclusion in 1975 in the New Topographics group.

      Lewis Baltz, Tract House #22, from the portfolio The Tract Houses, 1971

      Historians tend to mark milestones in order to explain, in a practical way, the accumulation of events that shape new realities. The exhibition New Topographics: Photographs of a Man-Altered Landscape, held in 1975 at the George Eastman House International Museum of Photography (Rochester, New York), was one of those founding moments, giving rise to a fruitful generation of landscape photographers who conceived of landscape not merely as a genre, but as a means of understanding the world they record.

      These were photographers credited with artistic capacities, placed on the same plane as sculptors or the pioneering artists of Land art. In 1990 the Bechers received the sculpture prize at the Venice Biennale; although no special category yet existed for photography, had their work not been valued within the artistic field as a hybrid of sculpture, architecture and photography, they would not even have been considered for the award.

      Under the New Topographics label, alongside Lewis Baltz, are also grouped Robert Adams, Bernd and Hilla Becher, Frank Gohlke, Nicholas Nixon, Stephen Shore, Joe Deal, John Schott and Henry Wessel, Jr.

      They rewrote the rules of landscape photography as it had been known until then, dwelling on industrial archaeologies, on the banal, on ruins and refuse. Their landscapes began to be non-places: spaces where the gaze does not linger, with imperceptible details that reveal the altered landscape, and that allow us to describe the world as it is transformed by human action, without emotional archetypes about the natural.

      This group of photographers, governed by the spirit of the New Topographics, is described as devoid of emotion; in truth they simply demolished the romantic idea of landscape, however many of us still stand inevitably transfixed before a sunset saturated with ochres, oranges, reds and yellows.

      The early Baltz works in black and white, concerned with man's cultural relations with the landscape, with those unstable boundaries between the urban and the rural, and with how unexplored territories are altered by industrialization and by the relationship established once the developable value of land is discovered, recalling Richard Cantillon's eighteenth-century maxim, seemingly still in force, that wealth lies in the land.

      There is a Baltz in color, by then living between two dreamed cities, Paris and Venice: the Baltz of the magnificent series Sites of Technology (1981-91), which addresses the spaces that are altering the ecosystem, and the modification of the planet through new technologies that once again set a new paradigm for interacting with the environment. The center of wealth shifts from the land and the manipulation of the landscape toward more invisible threads.

      With The Power Trilogy (1992-1995) he kept probing the uses of new technologies as mechanisms of power, whether surveillance machines or our vulnerability before the machines of the new medicine. Today's Big Data. The great eye that not even Foucault could have dreamed of.

      At the same time, the materialization of his work, from black and white and a more or less manageable format, evolved into site-specific pieces and large installations such as Ronda de noche (Night Watch), presented in 1992 at the Centre Georges Pompidou, where he establishes a clear connection with Rembrandt's monumental painting (1642).

      Rembrandt's monumental canvas captures the moment when the military company of Captain Frans Banninck Cocq and Lieutenant Willem van Ruytenburgh prepares to begin the night watch through the city of Amsterdam, charged with keeping order. The arquebuses and lances are replaced by cascades of glass fiber and panels of windows lit with artificial light. The faces of the arquebusiers, individualized by the artist so as to be recognizable, are replaced by a single robotic face without identity.

      The exhibition also includes one of Lewis Baltz's last works, the marvelous book Venezia Marghera. The only remarkable thing about the city of Marghera is its proximity to the city of Venice: two cities that agonize and sink, like the civilization that gave birth to them and that Lewis Baltz set out to record.

      The work of Lewis Baltz is, in all its complexity, an exploration not only of the capacity to show reality, but also of the capacity to perceive it. The narrative factor thus results not only from what we are able to perceive in and about the images, but from the specific way in which the sense of vision is constituted for the observer.
      Carl Aigner

                The Fireside Chat: An Inside PR Co-Op with Tyler Brown        

      On the July Spin Sucks Fireside Chat, Gini Dietrich chats with Tyler Brown about digital communications, his former role at the RNC, and Sean Spicer.


                Insight Engines Series A        
      My company, Insight Engines, recently announced Series A funding to make big data easily queryable by everyone. We're bringing natural language technology to the cybersecurity domain, so you can use plain English search queries to navigate large datasets for security investigations. If you're also interested in the intersection between NLP and cybersecurity, we're hiring.
                How Communicators Can Use Big Data to Inform Decisions        

      The world is the communications pro's oyster now that we have Big Data. Here we explore what you need to know, what to ask yourself, and what to analyze.


                DataFlair Hiring For Freshers : 2016/2017 Pass outs : Technical Content Writer @ Indore        
      DataFlair Web Services Pvt Ltd [www.data-flair.training] openings for freshers: BE/BTech, BSc, BCA, MCA, MSc – 2016/2017 pass-outs: Technical Content Writer @ Indore. Exclusive job for PresentJobs.com. Company profile: We are a leading provider of online training on the latest niche Big Data technologies like Hadoop, Flink, Spark, Scala, HBase, Kafka, Storm etc. ...
                Bayesian Decision Theory        
      Alright! You have probably been hearing a lot about Big Data and Data Scientists. The big data craze was in full swing when the Harvard Business Review published an article three years ago titled "Data Scientist: The Sexiest Job of the 21st Century". And in order to become a […]
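      Since the post is cut off, here is a minimal, self-contained sketch of the core idea it names: a Bayesian decision rule computes the posterior with Bayes' rule and then picks the action with the lowest expected loss. Every number below (priors, likelihoods, losses) is invented purely for illustration.

          # Bayesian decision rule: choose the action minimizing expected posterior loss.
          # All numbers are illustrative assumptions, not from the original post.
          priors = {"spam": 0.4, "ham": 0.6}          # P(class)
          likelihood = {"spam": 0.8, "ham": 0.1}      # P(evidence | class)

          # Posterior via Bayes' rule: proportional to P(evidence | class) * P(class)
          unnorm = {c: likelihood[c] * priors[c] for c in priors}
          z = sum(unnorm.values())
          posterior = {c: p / z for c, p in unnorm.items()}

          # loss[action][true_class]: deleting legitimate mail is the costly mistake.
          loss = {"delete": {"spam": 0.0, "ham": 10.0},
                  "keep":   {"spam": 1.0, "ham": 0.0}}

          expected = {a: sum(posterior[c] * loss[a][c] for c in posterior) for a in loss}
          print(min(expected, key=expected.get))      # "keep" (0.84 vs 1.58 expected loss)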
                Big Data Day: Understanding the data        
      Learn the most suitable guidelines for implementing Big Data projects, managing the information a company holds in the most appropriate way in order to achieve results and make decisions in line with business strategy. Watch IDGtv's "Big Data Day: Understanding the data". 24 February 2015.
                Big Data Day: Business and innovation from data        
      Watch "Big Data Day: Business and innovation from data", a session covering best practices for managing, exploiting and measuring the impact of all the information that surrounds a company.
                Live - Forum Big Data & Analytics 2015: Information at the service of the business        
      Live broadcast of Forum Big Data & Analytics 2015, from the Hotel Ritz in Madrid.
                Could you be Automating more of your reporting?        
      Get your time back. Automation is a buzzword in the Big Data world. We consider it a key piece of most client proposals around optimization and advanced analytics. Automating the...
                James Bridle, "Big data? No thanks"        
      "Big data? No thanks," James Bridle. Presented at the Through Post-Atomic Eyes symposium, Toronto, 23-25 October 2015. Videography by Renée Lear, http://reneelear.com
                Apache Hadoop and Big Data on IBM Cloud        

      Companies are producing massive amounts of data—otherwise known as big data. There are many options available to manage big data and the analytics associated with it. One of the more popular options is Apache Hadoop, an open source software designed to scale up and down quickly with a high degree of fault tolerance. Hadoop lets organizations gather and examine large amounts of structured and unstructured data.
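      To make the MapReduce programming model behind Hadoop concrete, here is a single-process Python sketch of the classic word count. It imitates the map, shuffle and reduce phases in memory; it is a teaching toy, not Hadoop's or IBM Cloud's actual API.

          # Word count in the MapReduce style Hadoop popularized (local sketch).
          from collections import defaultdict

          def mapper(line):
              # Map phase: emit a (word, 1) pair for each token.
              for word in line.split():
                  yield word.lower(), 1

          def reducer(word, counts):
              # Reduce phase: sum all counts for a single key.
              return word, sum(counts)

          def word_count(lines):
              grouped = defaultdict(list)      # the "shuffle" groups pairs by key
              for line in lines:
                  for word, one in mapper(line):
                      grouped[word].append(one)
              return dict(reducer(w, c) for w, c in grouped.items())

          print(word_count(["big data big insight", "hadoop scales big data"]))
          # {'big': 3, 'data': 2, 'insight': 1, 'hadoop': 1, 'scales': 1}

      On a real cluster, Hadoop runs many mapper and reducer processes in parallel across machines and handles the shuffle and fault tolerance for you; the contract of the two functions stays the same.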


                Daily Stock Deals Initiates Profile Coverage on DSG Global (OTCQB: DSGT)        
      How a Small Big Data Company Could Help Save Thousands of Lives on American Highways and Save Millions of Dollars for the Trucking Industry. SURREY, BC / ACCESSWIRE / June 7, 2016 / DSG GLOBAL INC. (DSGT), a proven industry …
                Latinoware 2014, here we come!        

      Introduction

      After 5 years I am returning to Latinoware, the Free Software community event held in Foz do Iguaçu - Paraná - Brazil.

      Beyond the personal connections, PGP key exchanges, meeting virtual friends in person, beers and so on, there is an extensive and rich program. So, for my own organization, I list below the talks and workshops I intend to attend. If you are there at these times, we can share the same space-time coordinates :-)

      What I plan to attend

      The full program (with a synopsis of each talk/workshop/keynote) can be seen here.

      15/10/2014

      • 10h - 11h - GNU/Linux - It is not 1984 (or 1969) anymore - Jon "Maddog" Hall
      • SIMULATING PHENOMENA WITH GEOGEBRA - Marcela Martins Pereira and Eduardo Antônio Soares Júnior
      • 12h - 13h - Collaborative open spaces - Guilherme Guerra
      • 13h - 14h - (grab something to eat) and try to split myself between: Technological illiteracy and teacher training - Antonio Carlos C. Marques - and Internet of Things: Creating APIs for the real world with Raspberry Pi and Python - Pedro Henrique Kopper
      • 14h - 16h - Official opening of Latinoware
      • 16h - 17h - Hands-on video editing with Kdenlive - Carlos Cartola
      • 17h - 18h - red#matrix, much more than a social network - Frederico (aracnus) Gonçalves Guimarães

      16/10/2014

      • 10h - 11h - Copyright and the care needed when using "cloud" and "free" services to build educational objects - Márcio de Araújo Benedito
      • 11h - 12h - Collaboration and Free Tools: possibilities for counter-hegemonies at School - Sergio F. Lima
      • 12h - 13h - Free Teacher! The use of free software in teaching degrees - Wendell Bento Geraldes
      • 13h - 14h - (grab something to eat) and Open documentation standards - ODF - Fa Conti
      • 14h - 15h - Bitcoin, the future of money is open source (and free) - Kemel Zaidan - and Open Hardware Platform for Robotics - Thalis Antunes De Souza and Thomás Antunes de Souza
      • 15h - 16h - Mozilla and Education: how we are revolutionizing the teaching of digital skills - Marcus Saad
      • 16h - 17h - Arduino Uno x MSP 430 - Raphael Pereira Alkmim and Yuri Adan Gonçalves Cordovil
      • 17h - 18h - Inclusion of people with disabilities in Education - with Free Software it is possible - Marcos Silva Vieira

      17/10/2014

      • 10h - 12h - Digital presence: being there is not enough, you have to participate - Frederico (aracnus) Gonçalves Guimarães. This will be a hands-on session :-)
      • 12h - 13h - I think I'll have lunch :-)
      • 13h - 14h - Education and technology with free resources - Marcos Egito
      • 14h - 14:15h - Official event photo
      • 14:15h - 15:15h - Introduction to LaTeX - Ole Peter Smith
      • 15:15h - 16:15h - abnTeX2 and LaTeX: "absurd" standards and elegant documents - Lauro César
      • 16:15h - 17:15h - Data Science / Big Data / Machine Learning and Free Software - Eduardo Maçan

      If you are there, get in touch!


                Shall We Talk about Privacy?        

      Unless you have been hiding your head under a rock, you probably already know that the United States government is now known to have completely sabotaged the privacy of online communications.

      Not just a little bit. They are now partners in every bit of "Big Data", with the approval of Google and Facebook. Perhaps worse, since it (the United States government) will supposedly soon be able to store every piece of data that has ever passed through the entire internet - forever - and could theoretically break any encryption. (…)

      Keep reading this excellent translation here: http://amigos.tl1n.com/display/abinoam/120927. The original text, in English, can be read here: http://psfl.in/-d

      My practical moves on this conversation: Valuing my time online


                Leveraging ‘Big Data’ - Methodological Considerations in Health Services Research, New Webinar Hosted by Xtalks        

      The objective of this webinar is to review types of datasets commonly used in this field, discuss potential pros and cons of each type depending on research question, and identify other key considerations. To gain insights on this topic, join the live webinar broadcast taking place on Thursday, August 21, 2014 at 2pm EDT.

      (PRWeb July 30, 2014)

      Read the full story at http://www.prweb.com/releases/2014/07/prweb12056130.htm


                What will historical records say about Mind Mapping?        
      Historical records are fascinating, aren't they? I'm not the historical type, but my wife is, and she's the one who professionally works within the family tree arena. I kind of look at these things with a glazed look in my eyes, and it's very evident I'm rather uninterested in historical timelines.

      But late last year my wife sat me down (yes I was coerced) and she systematically showed me the structure of my family name tree. For the first time I had a very relaxed feeling as my wife took me through the past four centuries of her detective work relating to my family name.

      It was fascinating and I eventually got it. I appreciated her work and was taken by her method of operation as she weeded out the false avenues she often went down, eventually arriving at a proven source and then moving on to establish a new historical avenue to investigate. The potentials and probabilities are so captivating. It is true detective work.

      And this got me thinking about historical time-lines and their significance.

      I asked myself, and now you the reader: what will historical records say? Of course as always I am speaking directly about the visual mapping arena.

      For the past 50 years or so we’ve been introduced to the formalizing of hand drawn Mind mapping by the one and only Tony Buzan. And of course we’ve since then witnessed and experienced an exponential expansion of the original thought, method and evolved expressions of the original methodology.

      The constant though is: it's all based on Buzan Mind mapping, no denying that at all. And I can't help but bend the knee of respect to Tony Buzan for formalizing what was named Mind mapping into a structure that has indeed changed the lives of numerous adherents of this fascinating synaptic tool.

      And for those who would suggest Mind mapping has been around for millennia: well, they're right, as historical records do prove a form of mind mapping has been around throughout human history.

      However: one man, and that is Tony Buzan formalized hand drawn Mind mapping, so there we have it.

      What will the historical records say?

      The works of Roy Grubb are in my opinion such an important historical record relating to all things graphical data, information and knowledge mapping. Go sift through his awesome work and soak in the information and knowledge Roy presents at his domain.

      Mind mapping became Visual mapping by virtue of the inclusion of multiple graphical formats being added to the original radiant Buzan approach.

      And: Visual mapping has by virtue of this digital age morphed (even evolved) into knowledge mapping. And of course it is evolving continuously along with technology.

      It does seem we are at an epoch of great advancement, even to the point of enhanced reality and Artificial Intelligence being added to the mix, which seems to be taking the original format down the rabbit hole of mind-bending possibilities. What those possibilities are we can only surmise at this time, but suffice it to say they may indeed be mind bending.

      So: historical records are important, and as we push exponentially forward into uncharted territory with the help of big data, AI and augmented-reality systems, we must establish, support and enhance a true historical record of where this arena started (or was rebooted) from. Let's make sure we've got an accurate point of reference for an historical record.

      What do you believe the historical records will say about where this arena had its actual genesis, where it has been, and where it's going?

      The work of Tony Buzan, Roy Grubb, John England, Chuck Frey and developers such as CS Odessa, Mindjet, iMindMap, Xmind, organizations such as Biggerplate, and the product/service that underpins the digital mind evolution of this part of the 21st century, TheBrain: they're all part of this historical mosaic. There are many other contributing knowledge-architect thought leaders: see HERE.

      History is indeed in reality all about where we are going, with the knowledge of where we have been; is it not?

                Part Deux: The feedback cometh. Young G Chung: CEO, SimTech Systems, Inc        
      In answer to: The Evolution of the Visual Mapper (Part Deux)?

      As CEO of SimTech Systems Inc I'd like to share my thoughts with you.

      Yes, we've been the developer of MindMapper and Thinkwise for the past 20 years. Here are my thoughts regarding the visual thinking arena's past, present, and future.

      Visual thinking is one of the thinking attributes born to all human beings. There are different methods and degrees of visual thinking, but in the end everyone, whether they know it or not, thinks visually.

      Visual mapping software is an excellent tool that helps users to think creatively and logically utilizing the tree structure as a visual backbone. However, we must realize that visual mapping is just one method out of many that facilitate visual thinking.

      Let's look at the advancement in the visual mapping arena from the software perspective only. Computer-generated mind maps were available in the mid-1990s. The digital mind map market really opened up when the market received with enthusiasm MindMapper's key feature of converting a map into an MS Word document, shown for the first time at Comdex 2001.

      Before and after 2010, we witnessed an increased expansion of creative thinking at the team level through collaboration. And currently, individuals and teams are using mobile and the cloud for their visual mapping needs.

      The biggest contribution visual mapping has made is that visual diagramming skills once available only to professionals became accessible to many, helping to simplify and expedite their daily activities. In short, it has fundamentally contributed to achieving happiness and success by utilizing the visual and whole-brain thinking capabilities given to all human beings.

      A lot of people are still puzzled and want to know why the visual mapping software industry has not solidified itself as mainstream. But one must ask whether such a question or expectation is valid and beneficial.

      A lot of people cannot distinguish the difference between visual mapping and mind mapping. Visual thinking and visual tools had been around even before mind maps existed, and they are not limited to a tree-structure visualization.

      Visual mapping products, when they came to market, emphasized main benefits and positive effects inherent only to mind mapping. As a result, mind mapping became a buzzword that the general public equated with visual thinking.

      Many visual mapping software developers from 2000 to 2005 used mind mapping in their marketing efforts. However, as time passed, many realized that mind mapping software is nothing more than information visualized in a tree structure and started to lose interest in the whole arena. On top of that, it is tough to replicate the original mind map's memory benefits with a digitally created map. Ironically, mind mapping contributed to the rise of the visual mapping market; yet in the process, it also contributed to the loss of visual mapping's identity and definition.

      Salt is used in almost all food. However, salt is never presented as food itself. Visual thinking is a God-given natural talent that everyone uses in their daily activities. But the mind mapping technique, or tree-based visual mapping, is just a small part of the visual mapping arena. We all know of a visual mapping developer who positioned itself as a project management tool in the early 2000s as a way to distinguish itself from the rest of the industry, but failed to emerge as a viable project management tool.

      Mind mapping or any tree-based visual mapping software can work together with groupware, ERP, and the like to create synergy, but it can never become a substitute or replacement. From this perspective, I foresee future products integrating with mainstream products, or evolving to combine technologies such as databases, Big Data, and AI with visualization through visual mapping.


      MindMapper has also contributed to the growth of the visual mapping arena for the past 20 years. "Value creation through innovation" is MindMapper's direction for the next stage of development. To create value, you need to ideate and execute. Our goal is to evolve into a more comprehensive tool that can facilitate idea to action. Our current iteration, MindMapper 16, is the first product of a development process pursuing this objective.

      Thank you;

      Young G Chung
      CEO, SimTech Systems, Inc.
      Developer of MindMapper
                Data purge algorithm: Efficiently delete terabytes of data from DB2 for Linux, UNIX, and Windows        
      Big data introduces data storage and system performance challenges. Keeping your growing tables small and efficient improves system performance as the smaller tables and indices are accessed faster; all other things being equal, a small database performs better than a large one. While traditional data purge techniques work well for smaller databases, they fail as the database size scales up into a few terabytes. This tutorial will discuss an algorithm to efficiently delete terabytes of data from the DB2 database.
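      The tutorial describes its own algorithm in detail; as a hedged sketch of the general technique such purges build on (deleting in small chunks, committing per batch so locks and transaction-log usage stay bounded), something like the following could work. The table, column, and connection details here are assumptions for illustration, not the tutorial's code.

          # Sketch: batched purge of old rows from a large DB2 table.
          # Table/column names and the DSN are illustrative assumptions.
          import pyodbc

          conn = pyodbc.connect("DSN=BIGDB;UID=user;PWD=***")   # hypothetical DSN
          cur = conn.cursor()
          BATCH = 10000   # small batches keep locks short and the log from filling

          while True:
              # DB2 for LUW can delete through a fetch-limited fullselect.
              cur.execute(
                  "DELETE FROM (SELECT 1 FROM events "
                  "WHERE event_date < CURRENT DATE - 365 DAYS "
                  "FETCH FIRST 10000 ROWS ONLY)"
              )
              deleted = cur.rowcount
              conn.commit()        # commit each batch before taking the next
              if deleted < BATCH:
                  break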
                Big Data Is Becoming The Norm        
      Satellites and GPS technology are tools that have reached big industry. Take railroads, for example: one piece of technology, the hot box detector, will replace 500 employees. Bad for employees, good for stockholders. Is this beneficial for society? Only time will tell. A new farming trend is precision farming. Goodbye to conventional farming? […]
                Data-driven crime prediction fails to erase human bias        

      Poor, minority communities flagged as drug crime trouble spots in case study

      BIG DATA DOESN'T PAY: Software programs that use police records to predict crime hot spots may result in police unfairly targeting low-income and minority communities, a new study shows.

      Big data is everywhere these days and police departments are no exception. As law enforcement agencies are tasked with doing more with less, many are using predictive policing tools. These tools feed various data into algorithms to flag people likely to be involved with future crimes or to predict where crimes will occur.

      In the years since Time magazine named predictive policing one of 2011's 50 best inventions of the year, its popularity has grown. Twenty U.S. cities, including Chicago, Atlanta, Los Angeles and Seattle, are using a predictive policing system, and several more are considering it. But with the uptick in use has come a growing chorus of caution. Community activists, civil rights groups and even some skeptical police chiefs have raised concerns that predictive data approaches may unfairly target some groups of people more than others.

      New research by statistician Kristian Lum provides a telling case study. Lum, who leads the policing project at the San Francisco-based Human Rights Data Analysis Group, looked at how the crime-mapping program PredPol would perform if put to use in Oakland, Calif. PredPol, which purports to “eliminate profiling concerns,” takes data on crime type, location and time and feeds it into a machine-learning algorithm. The algorithm, originally based on predicting seismic activity after an earthquake, trains itself with the police crime data and then predicts where future crimes will occur.

      Lum was interested in bias in the crime data — not political or racial bias, just the ordinary statistical kind. While this bias knows no color or socioeconomic class, Lum and her HRDAG colleague William Isaac demonstrate that it can lead to policing that unfairly targets minorities and those living in poorer neighborhoods.

      By applying the algorithm to 2010 data on drug crime reports for Oakland, the researchers generated a predicted rate of drug crime on a map of the city for every day of 2011. The researchers then compared the data used by the algorithm — drug use documented by the police — with a record of overall drug use, whether recorded or not. This ground-truthing came from taking public health data from the 2011 National Survey on Drug Use and Health and demographic data from the city of Oakland to derive an estimate of drug use for all city residents.


      Wheredunit

      Drug use in Oakland is probably fairly widespread (left) based on estimates derived in part from the 2011 National Survey on Drug Use and Health. But police records of drug reports and crimes are concentrated in areas that are largely nonwhite and low-income (right).


      In this public health-based map, drug use is widely distributed across the city. In the predicted drug crime map, it is not. Instead, drug use deemed worthy of police attention is concentrated in neighborhoods in West Oakland and along International Boulevard, two predominately low-income and nonwhite areas.

      Predictive policing approaches are often touted as eliminating concerns about police profiling. But rather than correcting bias, the predictive model exacerbated it, Lum said during a panel on data and crime at the American Association for the Advancement of Science annual meeting in Boston in February. While estimates of drug use are pretty even across race, the algorithm would direct Oakland police to locations that would target black people at roughly twice the rate of whites. A similar disparity emerges when analyzing by income group: Poorer neighborhoods get targeted.

      Shifting target

      While drug use estimated from public health data is roughly equivalent across racial classifications (top), police using a predictive policing algorithm in Oakland, Calif., would target black people at roughly twice the rate of whites (bottom).

      And a troubling feedback loop emerges when police are sent to targeted locations. If police find slightly more crime in an area because that’s where they’re concentrating patrols, these crimes become part of the dataset that directs where further patrolling should occur. Bias becomes amplified, hot spots hotter.

      There’s nothing wrong with PredPol’s algorithm, Lum notes. Machine learning algorithms learn patterns and structure in data. “The algorithm did exactly what we asked; it learned patterns in the data,” she says. The danger is in thinking that predictive policing will tell you about patterns in the occurrence of crime. It’s really telling you about patterns in police records.
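      The feedback loop described above is easy to reproduce in a toy simulation. In the sketch below, two districts have identical true crime rates, but one starts with more recorded crime; because patrols follow the records and new records follow the patrols, the initial disparity sustains itself year after year. All numbers are invented and model neither Oakland nor PredPol.

          # Toy simulation of the predictive-policing feedback loop.
          import random

          random.seed(1)
          TRUE_RATE = 0.1                      # identical underlying crime in A and B
          recorded = {"A": 20.0, "B": 10.0}    # biased starting records (2:1)

          for year in range(5):
              total = sum(recorded.values())
              # Patrols are allocated in proportion to recorded crime (the "prediction").
              patrols = {d: 1000 * recorded[d] / total for d in recorded}
              # Crime is only recorded where police are present to observe it.
              for d in recorded:
                  recorded[d] += sum(random.random() < TRUE_RATE
                                     for _ in range(int(patrols[d])))
              print(year, {d: round(p) for d, p in patrols.items()})
          # District A keeps drawing roughly twice the patrols despite equal true rates.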

      Police aren’t tasked with collecting random samples, nor should they be, says Lum. And that’s all the more reason why departments should be transparent and vigilant about how they use their data. In some ways, PredPol-guided policing isn’t so different from old-fashioned pins on a map.

      For her part, Lum would prefer that police stick to these timeworn approaches. With pins on a map, the what, why and where of the data are very clear. The black box of an algorithm, on the other hand, lends undue legitimacy to the police targeting certain locations while simultaneously removing accountability. “There’s a move toward thinking machine learning is our savior,” says Lum. “You hear people say, ‘A computer can’t be racist.’”

      The use of predictive policing may be costly, both literally and figuratively. The software programs can run from $20,000 to up to $100,000 per year for larger cities. It’s harder to put numbers on the human cost of over-policing, but the toll is real. Increased police scrutiny can lead to poor mental health outcomes for residents and undermine relationships between police and the communities they serve. Big data doesn’t help when it’s bad data.


                NC: Report Looks at Charlotte's Sustainability Trends         

      Aug. 05--Charlotte trails national averages on transportation and land use patterns while showing improvement in energy use and other local measures, says a first-of-its-kind sustainability report card released Tuesday.
      The nonprofit group Sustain Charlotte examined data trends in nine categories to produce the report, which is aimed to help local governments set goals and create policies.
      "We're living in a time when more and more we're making decisions using big data," said Shannon Binns, Sustain Charlotte's executive director. "It seems important to have an understanding of whether we're making progress on these issues."
      The report assigns two grades for each category, the first measuring local trends and the second a comparison to national averages.
      The county's best grade was for water usage, in which Mecklenburg was given a B when compared to the nation as well as a B for its own usage trend.
      Since the 2007 drought, water usage has declined significantly, which is one factor in a number of rate increases by Charlotte Mecklenburg Utilities.
      Mecklenburg fared worse on land use and transportation, getting Ds in both when compared to the nation. The trend line for transportation was better, with a B.
      The report said the Charlotte metro area has been ranked as the fifth most sprawling area in the nation, and that the amount of land used for parks, on a per capita basis, has decreased since 2007.
      In terms of transportation, Mecklenburg received high marks for an increase in people using public transportation to get to work (2.5 percent in 2000 to 3.8 percent in 2011), as well as the construction of the Lynx Blue Line.
      But the report card noted that the area still trails national averages in terms of people who commute by biking, walking or taking public transportation.
      Charlotte is building a $1.1 billion extension of the Lynx Blue Line to University City. But the Charlotte Area Transit System doesn't have enough money left to build other large rail projects.
      Among the trends the report details: The number of local families and children living in poverty doubled between 2000 and 2011. Transportation costs are taking larger chunks of personal income. Sixty neighborhoods are "food deserts." Sprawling land development continues.
      "Overall what this report shows is that there are very few areas in which we are making dramatic strides forward and outshining the national averages," Binns said.
      Charlotte City Council member John Autry, who chairs the city's environmental committee, said the region can improve.
      "Are we a leader (in these areas)? Not today," Autry said.
      He said the region could make significant improvements, including a "pay as you throw" program in which residents pay for how much garbage they throw away.
      "That would have a significant impact," he said.
      County manager Dena Diorio said Mecklenburg's goal is to have a park within a 5- to 10-minute walk of all residents.
      The report is aimed at local decision-makers, but the group hopes to also influence individual choices. It's also intended to serve as baseline data for residents involved in the Mecklenburg Livable Communities Plan, which will develop community goals.
      Sustain Charlotte makes recommendations for each category, with some drawn from sources such as Mecklenburg County's biennial State of the Environment report.
      The Z. Smith Reynolds Foundation provides operating grants to Sustain Charlotte. The Davidson College Sustainability Scholars program provided intern Jordan Luebkemann, who helped compile the report's data. The report's other authors include Sustain board member Jennifer Fairchild, staff member Meg Fencil and Binns.
      Copyright 2014 - The Charlotte Observer

                IDG Contributor Network: From big data to good data: closing the gap between data governance and business insights        

      With the expansion of the digital gold rush, data is moving into the spotlight and becoming a valuable source of information. Estimates are that the digital universe will continue to double every two years at least, reaching 44 zettabytes by 2020, a 50-fold growth compared to 2010. The sheer size of the data lake is staggering, but the million-dollar question remains: how to make sense of the data tsunami and capitalize on it.
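      As a quick sanity check on those figures (a sketch using only the numbers quoted above): 50-fold growth over 2010-2020 implies roughly 48% compound annual growth, slightly faster than a doubling every two years (about 41% per year), which is consistent with the "at least" in the estimate.

          # Sanity-checking the growth figures quoted above.
          fifty_fold_cagr = 50 ** (1 / 10) - 1     # 50x over ten years
          doubling_2yr_cagr = 2 ** (1 / 2) - 1     # doubling every two years
          print(f"{fifty_fold_cagr:.1%}")          # ~47.9% per year
          print(f"{doubling_2yr_cagr:.1%}")        # ~41.4% per year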

      Storage costs keep plummeting

      The phenomenon referred to as Moore’s law has been observed for decades, and with the emergence of new technology (SSD, SW-defined storage, object storage, etc.) as well as the consolidation within the storage industry, the price spirals keep heading south with a double-digit decline year-on-year.



                Comment on EPI 052 | Making informed property decisions with David from Property Planning Australia by Kaz        
      Hi Paul, thanks for your comments and a great suggestion regarding property data. It's a topic I have on my 'potential episodes' list, and it's a tricky one too, as even the 'stats experts' interpret and calculate things differently! I'm currently trialling a few different data providers and so have some contacts at the big data providers as a result, so I'll see what I can tee up! Cheers, Kaz
                Taiwan's largest fully-functioned ICT research institute joins 2016 Selangor Smart City Conference to offer "So Fashion", a big data analytics solution for Omni-Channel Merchandising        
      KUALA LUMPUR, 6 December 2016: According to the market research firm Market Intelligence & Consulting Institute (MIC) in Taiwan, the market for big data related applications already exceeded 45 billion US dollars in 2016. The market has grown tremendously in the last few years, and big data information processing will become a main driving force for [...]
                Taiwan Goes South with 7 Smart ICT Innovations        
      KUALA LUMPUR, August 9 – The overwhelming impact of the development of cloud, big data analysis, mobile and social networking applications has led to worldwide digital transformation. A visiting delegation of seven (7) leading Taiwanese ICT companies and a big group of Malaysian ICT companies, including a local telecommunications company of great renown, governmental [...]
                Economic and Geo-Political Prognosis for 2015        

      Paper No. 5856                                 Dated 12-Jan-2015

      Guest Column by Dr. Rajesh Tembarai Krishnamachari and Srividya Kannan Ramachandran

      Abstract:

      The re-moderation of the world economy set in place over the past few years continues apace. Notwithstanding some lasting damage on the supply side through the 2008 recessionary trough, our outlook for 2015 is bullish, weighing optimistic data trends more heavily than the continued negative sentiment proffered by some analyst quarters.

      Around the world in 80 (or more) words:

      Treating the ten-year US Treasury bond yield as a proxy indicator for that nation's nominal GDP growth, we anticipate United States to grow around 3% next year.[1] While this does not mark a return to the buoyant 90s, it is better than the secular stagnation hypothesized earlier in 2014.[2] With US acting as an engine to spur growth, the world economy should also expand by more than 3%.[3] Stability across the world will be maintained – as sparks without a concomitant fury will characterize both overt (e.g. Russia-West over Ukraine) and covert (e.g. China-Japan over Senkaku) animosities.[4] European stagnation from debt and unemployment will be counterbalanced through quantitative easing by the European Central Bank.[5] Similar action in Japan will display the limits of Abe-nomics.[6] China will prepare for a structural slowdown emphasizing domestic consumption and de-leveraging an over-heated financial sector; all the while growing at a 7% rate that will amaze rivals around the world.[7] Indian reform, even if inadequate, will boost the middle classes and reinforce confidence in the Modi government.[8] African countries will find their commodity boom dissipate and ease of borrowing decline as commodity prices fall and yields rise in the developed world.[9]

      Continental tectonics:

      a. North America:

      Economic benefits arising from the exploitation of shale gas have not only silenced the anti-fracking environmentalists, they have altered the strategic world-view of Washington politicians.[10] As US aims to overtake even Saudi Arabia in oil/NGL production in 2015 (and the Saudis pull out all stops in preventing it by driving crude prices down), it has markedly reduced its role as a global policeman.[11] Its own economy is on the mend even as a lame-duck president will be bogged down with partisan gridlock. Markets will fret about the mid-year (or earlier?) hike in interest rates; though Main Street - aided by a strong dollar - will likely shrug it off with a continued upward movement across different sectors.[12]

      Mexico and Canada will benefit from their tight coupling with the United States.[13] Enrique Pena Nieto will claim credit for reforming the Mexican economy – across sectors as diverse as energy and telecom.[14] Pemex, dear to the Mexicans, will face some competition, though nothing remotely similar to the American acquisition of Tim Hortons – dear to the Canadians – will happen.[15] Up north, the Canadian elections in 2015 will reveal whether the country has reverted to its liberal propensities or sticks with Harper's conservative agenda.[16]

      b. Latin and South America:

      The outlook is disappointing across much of the region. Runaway inflation hammers Argentina and Venezuela; milder ill-effects bedevil Brazil, Bolivia and Uruguay.[17] The Maduro regime in Venezuela and the Kirchner government in Argentina continue to flirt with disaster as their GDP growth slips and mass discontent builds up.[18] Dilma Rousseff has stabilized her position electorally, though her policies continue to disappoint investors and have the potential to reignite sudden protests like the 2013 bus-fare protests.[19] Dependence on commodity exports in a time of declining prices does not portend well for any of the South American states, including Brazil.[20] On a positive note, Cuba – already expected by analysts to grow by close to 4% next year – will see a boost to its fortunes accruing from a thaw in relations with US under Obama.[21]

      c. Africa:

      African nations had a great run in the past few years. This arose not only from the boom in commodity prices but also from the need for yield amongst DM (developed market) investors, resulting in investment in both corporate and public African bonds.[22] In 2015, these factors could dissipate, which will place pressure on countries like Angola, where household spending has risen more than 4000% since the start of the millennium.[23] Ethiopia and Kenya are expected to continue on a robust growth path.[24] Contradictions abound within Africa, and nowhere are they more visible than in Nigeria. While the northern part struggles under the oppression of Boko Haram, the southern part booms under Goodluck Jonathan's presidency.[25] In neighboring South Sudan, one is reminded of the risk-reward payoff as the nation widely tipped to experience spectacular growth in 2014 got mired in conflict, with the consequent dissipation of growth potential.[26]

      American intervention in Libya undermined the Gaddafi-imposed order and has led to a civil war between the Islamist and secularist factions which will hold back that nation in the coming year.[27] A more benign intervention was that of the French in Mali in 2013; we expect more calls for Hollande's assistance in 2015.[28] El Sisi has stabilized Egypt after the Muslim Brotherhood interlude in the post-Mubarak era. Though more brutal than Mubarak, the El Sisi regime is being propped by both the Americans and Saudis, leading us to expect the recent bull run in Egyptian markets to continue.[29] ANC rule in South Africa continues unimpeded. Though atrophied by many scandals, the rule should produce close to 3% growth in the coming year.[30]

      d. Middle East:

      The region continues to be a cesspool of ethno-sectarian rivalries as the century-old Sykes-Picot agreement unravels.[31] Recep Erdogan has stabilized Turkey and should reap growth on par with other emerging economies.[32] Erdogan's external actions, driven by AKP's crypto-desire to establish a caliphate, will see him prop up the Islamic State (IS) just enough that it can damage Shia and Kurdish interests, but not enough to threaten his own Sunni hegemonic plans.[33] The Saudi establishment has focused on the removal of the Muslim Brotherhood threat; now they will focus on limiting Shia Iranian influence by keeping crude prices low.[34] Western companies made a beeline to Iran in 2014 in hope of an impending thaw; much will depend on the negotiating ability of the Rouhani establishment on the sanctions front.[35] Dubai and Israel remain insulated from the turmoil around them and could reap the benefit of the uptick in the world economy.[36] The risk of sudden flare-ups like the 2014 Gaza war continues to remain on the Israeli radar.

      e. Asia and Australia:

      The Asian political scene is remarkably stable with China, Japan and India looking inward to stabilize their economies under the leadership of Xi Jinping, Shinzo Abe and Narendra Modi, respectively. Some events have gone unnoticed by world media – for example, China starts the year of the goat as the world's largest economy when measured in PPP terms and for the first time ever, Chinese outbound investments could exceed those inbound.[37] The establishment of China on the world stage has made Xi stronger than any Chinese leader in recent memory bar Chairman Mao himself. The Abe regime will continue on its reformist route of bringing Japan out of the deflationary zone, while winking at nationalist sentiment calling for a re-interpretation of the country's post-war pacifist role.[38] Down south in India, Modi has surprised both supporters and detractors alike by his middle-path approach to reforming the economy and his zealous interest in foreign policy. While reforming cautiously, he has not removed the populist schemes of the previous government. 2015 will see him act unimpeded by local elections (other than in Bihar) and will prove to be a litmus test of his claims of good governance.[39]

      Afghanistan under Ashraf Ghani will face more trouble from the Taliban as US adopts the Pakistani classification into good versus bad Taliban.[40] In nearby Pakistan, the wildly popular Imran Khan - with some help, perhaps, from the deep state – will challenge the established parties in their home turfs.[41] In Indonesia, Joko Widodo has come to power with Imran Khan-type support amongst the youth, and he will be hard-pressed to implement his reformist agenda – including reducing fuel subsidies – amidst persistent opposition from entrenched interests.[42] ASEAN will continue to slip on its stated intentions for closer cooperation.[43] Australia will try to balance its strategic partnership with the United States with economic dalliances with the Chinese.[44]

      f. Europe and Russia:

      Vladimir Putin will be emboldened by the short-term rise in domestic popularity, and hence ignore the longer-term implications of his intervention in Ukraine.[45] Tighter coupling with Kazakhstan and Belarus will not prevent what is likely to be a low-growth and high-inflation year for the Russians.[46] Europe as a whole continues to underperform, and it will be most visible in France and Italy, both of whom might record less than 1% growth in GDP. With the Trierweiler-Gayet saga behind him, Francois Hollande will attempt to rein in a deficit running at close to 4% of GDP. Even with help from ECB's quantitative easing program, there is little expectation that Hollande can avoid being the most unpopular leader amongst all western democracies.[47] In Italy, high debt and unemployment – exemplified by the statistic of four-fifths of Italians between the ages of 20-31 living with parents – will hamper any efforts Matteo Renzi might take to pull the economy out of its doldrums.[48]

      The Greeks might look forward to a better year, especially when juxtaposed against their recent past. On the back of painful reforms, the Greek economy is widely anticipated to commence its long journey back to health, though there might be recurrent political scares and persistent rumors of a Greek exit.[49] The German government will be buffeted by opposing demands – external calls for a more interventionist role in stabilizing the world economy, and internal ones for tempering the same. Cautious progress on the fiscal front will lead to modest GDP growth.[50] Ironically, the European nations with the best GDP growth projections are also the ones with the highest exposure to Putin's misadventures, viz. Poland, Latvia and Lithuania.[51]

      Sectors and segments:

      Having dropped significantly in the past few months, the level of oil prices affects the prospects for many industry sectors in 2015. Oil is typically expected to revert to the mean, because a lower oil price has a discernible impact on both the supply side (by discouraging investment in production and distribution) and the demand side (by boosting economic activity).[52] The speed of such mean-reversion remains unclear. Russia, Iran and US shale producers (especially those not based at strategic locations) suffer disproportionately more than the Saudi establishment at current price levels.[53] Lower oil prices will provide a fillip to consumer discretionary industries and airlines, and have an adverse impact on railroads (which benefit from oil transportation) and petrochemical companies. The shale gas boom - apart from increasing housing activity - is also the prime driver behind growth in the US steel and construction material sectors; consequently, both the steel and construction sectors will remain susceptible to movements in crude.[54]

      Low interest rates and low macro-growth prospects will induce companies with excess cash to acquire other companies in order to report earnings growth. That trend will be apparent in sectors as diverse as healthcare, industrials, semiconductors, software and materials.[55] Elsewhere in the investment banks, trading desks will see higher market volatility as major powers pursue divergent paths to monetary policy (e.g. the US against the EU/Japan).[56] In the US, regulatory obligations that increase the cost of capital for holding certain securities might lead to decreased broker liquidity.[57] 2015 will see the big banks grapple with the Basel III and Volcker regulations; one expects the regulatory push towards vanilla deposit-taking and lending to continue.[58] Analysts will hope that stronger balance sheets coupled with a return to profitability lead to increased dividend payouts for investors in financial stocks. China will seek to tame its overheated financial sector amidst a structural slowdown,[59] and India will see RBI governor Raghuram Rajan continue his battle against political interference in corporate lending.[60] Wealth management services will perform remarkably well not only in China, but also, to a lesser extent, in the US, as a rising market creates wealth and a retiring baby-boomer crowd seeks to couple low risk with acceptable returns.[61] In the arena of mobile payments, Apple Pay will try to avoid the lackluster performance of earlier attempts like Google Wallet.[62]

      Lower gasoline prices and an accompanying increase in disposable income (through wealth creation at the markets, increased home values, reduced unemployment and improved economic activity) create a positive outlook for the consumer discretionary sector. Companies dealing with organic farming will benefit from increased health consciousness; the market for yoga will continue to rise after 2014 saw the UN declare a World Yoga Day on Modi's initiative.[63] Even as DVDs and Blu-rays fall, digital film subscriptions and on-demand internet streaming will rise to please Hollywood.[64] Bollywood will get over its obsession with INR 100 crore revenues as movies cross that level more frequently.[65] With the supply of hotels remaining at the level of a few years back, revenue per room will rise across the sector.[66] Tighter access to credit continues to hamper the rise in existing-home sales, which nevertheless should improve over the past year.[67] Asian apparel manufacturers will continue to improve their market share in the fast fashion market.[68] October 2015 will see Europeans benefit from the eCall service in all their new cars, which allows a car to immediately report the details of any accident. New carbon-emission standards also come into force in Europe; elsewhere too, the move towards higher efficiency in cars will continue.[69] Widodo will be pleased at the growth in automobile sales in Indonesia, which should exceed that of other major markets.[70] Internet advertising is rising faster than television commercials, though 2015 will still see the latter dominate the former in overall revenue generated.[71] Privacy continues to erode on the social media front.[72] The newspaper industry will see an increased number of advertorials re-packaged as "native advertising", by which companies pay for advertisements written as newspaper articles.[73]

      In India, the BJP government has yet to clarify its position on foreign direct investment in retail.[74] Irrespective of its final decision, retail sales should surge sharply upward there as the pent-up demand of the past few years meets the thriving 'mall culture' of middle-tier cities. China will also see an increase in retail sales in spite of its investigation into WalMart.[75] The anti-corruption campaign, though, will negatively impact sales of luxury goods as well as of higher-end automobiles there.[76] A strong dollar will affect US companies with significant operations abroad. Wheat production might match 2014's record volumes in Europe,[77] though more newsprint will probably be devoted to the higher price of cocoa from Ivory Coast.[78] The idiosyncrasies of local markets will shine as Dubai invests in large-scale brick and mortar malls, while Manhattan gets more of its groceries delivered to doorsteps.[79]

      Demand for energy should rise at the same pace as world GDP next year. Analysts will point to the attractive valuations of oil companies.[80] If shale gas prices remain attractive, Sabine Pass in Louisiana will emerge as the first plant in the US to export LNG.[81] Four years after the Fukushima incident, Japan will see nuclear reactors back in operation at Sendai.[82]

      2014 saw the denizens of the developed world fret about Ebola, breast cancer (through a campaign by actor Angelina Jolie) and ALS (through the ice bucket challenge).[83] Overall, health spending will comfortably outpace the rate of growth of the overall economy. The long-term secular trends driving this are the aging population in the Western world (with the population pyramid replaced by a population dome) and an emerging middle class elsewhere with increasing demand for improved access to healthcare.[84] Universal healthcare has been promised for all in India, which should drive up healthcare expenditure there by a significant amount.[85] In 2015, large US companies are mandated under Obamacare to provide insurance to more than 70% of their eligible workforce.[86] Uncertainty over US healthcare reform, and the debate thereon, may cause short-term price volatility. The Millennium Development Goals will be reviewed by the UN later in the year, with a new set of goalposts for countries to meet by 2030; different NGOs will campaign vigorously through the media to get their pet agendas included in the final list.[87]

      Transportation companies will report higher earnings from increased economic activity.[88] Apart from some airlines that have suffered reputational damage through recurring accidents, airline companies will benefit from reduced oil prices. The defense industry will see robust growth in China, as "Chi-America" proves to be no more than a chimera.[89] Alarmed by this increase, Vietnam along with the Philippines will move within the US ambit, and Australia will seek to join the tripartite naval exercises in the Indian Ocean between the US, Japan and India.[90] Tensions in Eastern Europe and the Middle East will favor increases in expenditure across those regions. The nationalist government in India will increase defense expenditure sharply even as it moves beyond lip service on the long-standing issue of indigenizing defense manufacturing.[91]

      The mantra of social-local-mobile (SoLoMo in tech jargon) continues to drive the consumer markets divisions of information technology companies.[92] Expenditure on IT hardware is significantly slowed by the increasing move to cloud computing.[93] The move to cloud computing - along with the increasing use of mobile commerce - bodes well for the computer security business.[94] India should see a sharp increase in smartphone adoption; elsewhere, tablet computers will rise against laptops and desktops.[95] Embedded systems coupled with rudimentary networking will be marketed as an all-encompassing "internet of things" as the era of big data continues.[96] Today, a single family in the US places more demands on data flow than the entire planet did a decade back; and even this data rate is expected to increase by a whopping 70% over the next year. Consolidation in the cable sector (e.g. Comcast with Time Warner Cable) and the convergence of content with distribution (e.g. AT&T with DirecTV) are two trends that should continue on from 2014.[97] Even as Indians talk about 3G coverage spanning the nation, Americans will tweet about 4G price warfare, and the Chinese will see ZTE unveil a 5G prototype.[98] Facebook will have more users than China has human beings.[99] Analysts will harp on the impact of interest-rate hikes on high-dividend-paying telecom stocks.[100] Apart from the financial industry, telecom will emerge as the industry most impacted by federal regulation across the globe.

      The anthropologist Edward Weyer once compared the future to a "corridor into which we can see only through the light coming from behind". It is in that sense that we have analyzed the data of the bygone year and tried to extrapolate into the days and months ahead. And when some predictions are falsified - and falsified, some will be - then we shall lay credit for the same at the feet of those responsible - viz. us, the people.

      [The authors are based in New York City, and can be contacted through email at tkrajesh@gmail.com and srivi019@gmail.com. The views represented above are personal and do not in any manner reflect those of the institutions affiliated with the authors.]

      References


      [1] See the graph titled “10 year bond yield: annual change and real GDP: annual % change” at http://www.swcollege.com/bef/econ_data/bond_yield/bond_yield_data.html.

      [2] “Secular stagnation: facts, causes and cures”, a VoxEU eBook, at http://www.voxeu.org/sites/default/files/Vox_secular_stagnation.pdf.

      [4] A brief historical perspective on the Russia-Ukraine conflict is at http://www.summer.harvard.edu/blog-news-events/conflict-ukraine-historical-perspective. The Economist magazine summarizes the debate over the Senkaku islands at http://www.economist.com/blogs/economist-explains/2013/12/economist-explains-1.

      [5] “The ECB, demigods and eurozone quantitative easing” at http://www.ft.com/cms/s/0/c90dd466-7bb4-11e4-a695-00144feabdc0.html#axzz3NIKpG2Fx.

      [6] “Bank of Japan announces more quantitative easing: the next chapter in Abenomics” at http://www.forbes.com/sites/jonhartley/2014/11/02/bank-of-japan-announces-more-quantitative-easing-the-next-chapter-in-abenomics/.

      [7] “World Bank urges China to cut economic growth target to seven percent in 2015, focus on reforms” at http://www.reuters.com/article/2014/10/29/us-china-worldbank-idUSKBN0II05P20141029.

      [8] “Reforms by PM Narendra Modi will help India to grow 5.5% this year, 6.3% next year: ADB” at http://articles.economictimes.indiatimes.com/2014-12-17/news/57154602_1_cent-the-adb-growth-forecast.

      [10] “The experts: how the US oil boom will change the markets and geopolitics”, http://www.wsj.com/articles/SB10001424127887324105204578382690249436084

      [13] “Economic growth patterns in USA, Canada, Mexico and China” at http://www.huffingtonpost.com/dominik-knoll/economic-growth-patterns-_b_5832182.html.

      [14] “Mexican president Pena Nieto's ratings slip with economic reform” at http://www.pewglobal.org/2014/08/26/mexican-president-pena-nietos-ratings-slip-with-economic-reform/.

      [17] “Andres Oppenheimer: Latin America's forecast for 2015: not good” at http://www.miamiherald.com/news/local/news-columns-blogs/andres-oppenheimer/article2503660.html.

      [18] “Maduro blames plunging oil prices on US war vs Russia, Venezuela” at http://www.reuters.com/article/2014/12/30/us-venezuela-oil-idUSKBN0K802020141230 and “What's in store for post-Kirchner Argentina” at http://globalriskinsights.com/2014/12/whats-store-post-kirchner-argentina/

      [19] “Brazil economists cut 2015 growth forecast to slowest on record” at http://www.bloomberg.com/news/2014-08-11/brazil-economists-cut-2015-growth-forecast-to-slowest-on-record.html

      [20] “Economic snapshot for Latin America” at http://www.focus-economics.com/regions/latin-america.

      [21] “Cuba, Dominican Republic and Puerto Rico business forecast report Q1 2015” at http://www.marketresearch.com/Business-Monitor-International-v304/Cuba-Dominican-Republic-Puerto-Rico-8538079/ and “Obama's Cuba move is Florida's top story for 2014” at http://www.washingtontimes.com/news/2014/dec/29/obamas-cuba-move-is-floridas-top-story-of-2014/.

      [24] “Ethiopia overview” at  http://www.worldbank.org/en/country/ethiopia/overview and “Kenya overview” at http://www.worldbank.org/en/country/kenya/overview.

      [26] “Internal violence in South Sudan” at http://www.cfr.org/global/global-conflict-tracker/p32137#!/?marker=33.

      [27] “Political instability in Libya” at http://www.cfr.org/global/global-conflict-tracker/p32137#!/?marker=14.

      [28] “The regional impact of the armed conflict and French intervention in Mali” at http://www.peacebuilding.no/var/ezflow_site/storage/original/application/f18726c3338e39049bd4d554d4a22c36.pdf.

      [29] “EGX head optimistic on equities as Egyptian economy recovers” at http://www.thenational.ae/business/markets/egx-head-optimistic-on-equities-as-egyptian-economy-recovers.

      [30] “Economy - outlook for 2015 dismal, despite boost” at http://mg.co.za/article/2014-11-25-economy-outlook-for-2015-not-encouraging-despite-boost.

      [31] “Pre-state Israel: The Sykes-Picot agreement” at http://www.jewishvirtuallibrary.org/jsource/History/sykes_pico.html.

      [32] “Turkey - economic forecast summary (Nov 2014)” at http://www.oecd.org/economy/turkey-economic-forecast-summary.htm.

      [34] “Saudi-Iranian relations since the fall of Saddam” at http://www.rand.org/pubs/monographs/MG840.html.

      [36] “Dubai 2015 cross sector business outlook extremely bullish” at http://ameinfo.com/blog/mentors/c/capital-club/dubai-2015-cross-sector-business-outlook-extremely-bullish/ and “Israel - economic forecast summary (Nov 2014)” at http://www.oecd.org/economy/israel-economic-forecast-summary.htm.

      [37] “China's leap forward: overtaking the US as world's biggest economy” at http://blogs.ft.com/ftdata/2014/10/08/chinas-leap-forward-overtaking-the-us-as-worlds-biggest-economy/.

      [38] “Understanding Shinzo Abe and Japanese nationalism” at http://www.foreignpolicyjournal.com/2014/05/26/understanding-shinzo-abe-and-japanese-nationalism/.

      [39] Book: “Getting India back on track: an action agenda for reform” edited by B. Debroy, A. J. Tellis and R. Trevor.

      [40] “US may not target Mullah Omar after this year” at http://www.dawn.com/news/1152382.

      [41] “The rise and rise of Kaptaan” at http://tribune.com.pk/story/800722/the-rise-and-rise-of-kaptaan/.

      [42] “Widodo launches reform agenda with fuel price hike” at http://www.focus-economics.com/news/indonesia/fiscal/widodo-launches-reform-agenda-fuel-price-hike.

      [43] “ASEAN's elusive integration” at http://opinion.inquirer.net/74164/aseans-elusive-integration.

      [46] “Russia's economics ministry downgrades 2015 oil price forecast to $80 per barrel” at http://itar-tass.com/en/economy/764662.

      [47] “Hollande popularity plumbs new low in mid-term French poll” at http://www.reuters.com/article/2014/11/06/us-france-hollande-idUSKBN0IQ14R20141106.

                Rant: would love some "big data, small cost" discussions        
      When I read the blogs and articles about Big Data, the subtext is always Big Money. Even though many of the tools themselves are open-source, they all seem to require Big Infrastructure and a horde of lavishly-paid consultants and brainiacs to get deployed. It's hard to find a "big data" story in which anybody appears to have paid attention to how much things cost.

      This is rather frustrating. While it's interesting in a sort of Hollywood-gossip way to read about how Megacorp or a multibillion-dollar "startup" deployed a zillion-rack Hadoop server farm, cooled by Icelandic glaciers, and baby-sat by a thousand people all over the world, in order to mine fascinating new ways to better separate customers from their cash, it doesn't help me much here in reality-land. Here, I'm "the guy", and we have some money to play with, but not an enormous amount - and we watch every penny.

      Fortunately, I have a few tricks up my sleeve, and I'd love to learn more. But I wish the database world would lose its fascination with big-data porn and share some real-life examples of how people are solving big-data problems under real-life budget and personnel constraints.
                Big Data: "Telemetry" schemas        
      Many database schemas have similar characteristics, and one common - and important - schema type is what I call the "telemetry" schema. Telemetry schemas have certain things in common:

      1. They have a small number of relatively simple tables. I'll call these the "telemetry log tables".
      2. They have a time component, and are often queried using that time component.
      3. They have at least one score or amount associated with the data, which is also queried frequently.
      4. Telemetry log tables are often very large, with hundreds of millions or billions of records.
      5. While there may be summary tables that are updated frequently, records in the telemetry log tables are rarely or never updated.

      Examples of telemetry schemas include:

      1. Actual telemetry data from various types of sensors.
      2. Wall street-style trading data.
      3. Other transactional data, such as banking activity, point-of-sale activity, etc.

      As mentioned above, telemetry schemas often have a time series component, but also need to be queried in interesting ways using approaches other than the simple time component.
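      To make this concrete, below is a minimal sketch of what a telemetry log table and its common time-based access path might look like. I'm using Python's built-in sqlite3 module purely for illustration - the table and column names are hypothetical, and a real deployment would sit on MySQL/InnoDB or similar:

      import sqlite3

      # In-memory database purely for illustration.
      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE sensor_log (
              sensor_id INTEGER NOT NULL,  -- which sensor/account/symbol
              log_time  TEXT NOT NULL,     -- the time component (point 2)
              score     REAL NOT NULL      -- the score or amount (point 3)
          )""")
      # Composite index covering the common access path:
      # "all readings for sensor X within time range Y..Z".
      conn.execute("CREATE INDEX sensor_time ON sensor_log (sensor_id, log_time)")

      conn.execute("INSERT INTO sensor_log VALUES (?, ?, ?)",
                   (42, "2013-01-15 10:30:00", 98.6))
      rows = conn.execute(
          "SELECT log_time, score FROM sensor_log "
          "WHERE sensor_id = ? AND log_time BETWEEN ? AND ?",
          (42, "2013-01-01", "2013-02-01")).fetchall()
      print(rows)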

      Telemetry schemas pose several challenges to standard schema design strategies:

      1. Telemetry log tables are typically too large for relational joins, even when well-indexed, to perform well, particularly on "bulk" searches that visit many records.
      2. Most database engines will "anchor" a search on one index, use it to fetch records out of the base table, and finish qualifying the search using values from these records, even if other indexes exist on the qualifying rows. This search strategy will perform awfully with huge tables, unless the engine gets lucky and picks a highly selective index, or the search happens to be on a very selective value in the index. Statistics-based query optimizers can help some, but can still "whiff badly", and this can result in what I call the database "death penalty" for queries: a badly-optimized "loser" query that takes days to run.

      In a future blog post, I'll talk about a search strategy I've successfully used for large telemetry schemas.
                PRIMARY KEYs in INNODB: Choose wisely        
      PRIMARY KEYs in InnoDB are the primary structure used to organize data in a table. This means the choice of PRIMARY KEY has a direct impact on performance. And for big datasets, the impact of this choice can be enormous.

      Consider a table with a primary search attribute such as "CITY", a secondary search attribute "RANK", and a third search attribute "DATE".

      A simple "traditional" approach to this table would be something like

      create table myinfo (city varchar(50),
      rank float,
      info_date timestamp,
      id bigint,
      primary key (id)
      ) engine=innodb;


      create index lookup_index
      on myinfo (city, rank, info_date);


      InnoDB builds the primary table data store in a B-tree structure around "id", as it's the primary key. The index "lookup_index" contains index records for every record in the table, and the primary key of the record is stored as the "lookup key" for the index.

      This may look OK at first glance, and will perform decently with up to a few million records. But consider how lookups on myinfo by a query like

      select * from myinfo where city = 'San Jose' and rank between 5 and 10 and info_date > '2011-02-15';

      are answered by MySQL:

      1. First, the index B-tree is walked to find the records of interest in the index structure itself.
      2. Now, for every record of interest, the entire "primary" B-tree is walked to fetch the actual record values.

      This means that N+1 B-trees are walked for N result records.

      Now consider the following change to the above table:

      create table myinfo (city varchar(50),
      rank float,
      info_date timestamp,
      id bigint,
      primary key (city, rank, info_date, id)
      ) engine=innodb;

      create index id_lookup on myinfo (id);

      The primary key is now a four-column primary key, and since "id" is distinct, it satisfies the uniqueness requirements for primary keys. The above query now only has to walk a single B-tree to be completely answered. Note also that searches against CITY alone or CITY+RANK also benefit.

      Let's plug in some numbers, and put 100M records into myinfo. Let's also say that an average search returns 5,000 records.

      Schema 1: (Index lookup + Primary Key lookup from index):
      Lg (100M) * 1 + 5000 * Lg (100M) = 132903 B-tree operations.

      Schema 2: (Primary Key lookup only):
      Lg(100M) * 1 = 26 B-tree operations. (Note that this single B-tree ingress operation will fetch 5K records)

      For this query, then, Schema 2 is over 5,000 times faster than Schema 1. If Schema 2 is answered in a second, Schema 1 will take nearly two hours.
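      If you want to sanity-check that arithmetic, here's the back-of-the-envelope calculation in Python, using log base 2 of the record count as the per-descent B-tree cost, as above:

      import math

      n_records = 100_000_000     # rows in myinfo
      n_results = 5_000           # records returned by an average search
      descent = math.log2(n_records)        # ~26.6 operations per B-tree descent

      schema1 = (1 + n_results) * descent   # index walk + one PK lookup per row
      schema2 = 1 * descent                 # single descent, then a range scan

      print(int(schema1))            # 132903
      print(int(schema2))            # 26
      print(int(schema1 / schema2))  # 5001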

      Note that we've played a bit of a trick here, and now lookups on "ID" are relatively expensive. But there are many situations where a table identifier is rarely or never looked up, yet is used as the primary key simply because "InnoDB needs a primary key".

      See also Schema Design Matters

                Development in a large-data environment        
      One of the biggest problems with application development in the context of "Big Data" is that the developer gets the database-interacting code to "work" in the developer's "playpen" database, but the code collapses when it's put into production. A related, but even more serious, problem - since it won't be found as quickly - is code that works early but degrades badly as the production database grows.

      There are a couple of approaches to this problem:

      1. Have the typical "small database" for initial coding and debugging.
      2. Have a large developer playpen database. It should be a large fraction of the size of the production database, or if the production database is small, it should contain contrived (but logically consistent) data that is a large fraction of the expected size of the production database.

      Developers should unit-test against both databases before committing their changes.
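      As a sketch of what that can look like in practice - assuming pytest, with sqlite3 standing in for whatever driver you actually use; the environment variable names here are hypothetical:

      import os
      import sqlite3

      import pytest

      # One small database for fast iteration, one production-sized playpen.
      DATABASES = [
          os.environ.get("SMALL_PLAYPEN_DB", "small_playpen.db"),
          os.environ.get("LARGE_PLAYPEN_DB", "large_playpen.db"),
      ]

      @pytest.fixture(params=DATABASES)
      def db(request):
          conn = sqlite3.connect(request.param)
          yield conn
          conn.close()

      # Every test now runs twice: once against each playpen database.
      def test_city_lookup_returns_rows(db):
          (count,) = db.execute(
              "SELECT count(*) FROM myinfo WHERE city = ?", ("San Jose",)
          ).fetchone()
          assert count >= 0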
                Sirius Computer Solutions to Acquire Brightlight Consulting        

      Sirius strengthens its big data and business analytics portfolio

      (PRWeb October 24, 2014)

      Read the full story at http://www.prweb.com/releases/2014/10/prweb12275535.htm


                Big Data, Big Deal?        
      More pieces of data have been produced in the last five years than in all of human history put together before then. But what's driving this big data revolution? We'll discover what opportunities it opens up, and we'll uncover the pitfalls we might be facing. Plus, news that scientists uncover the first water on Earth, and we talk to the team who raced a solar powered car 3,000 kilometres across Australia...
                Big Data in 2014        

      Big Data Analytics Trends for 2014-Infographic

      An infographic from Aureus about the top trends in big data, analytics and analytical models - the trends defining analytics today that are likely to have a significant influence in 2014 and beyond, around the globe.

      This article Big Data in 2014 was first published on iNFOGRAPHiCsMANiA.


                Make it Digital!        
      This week, broadcasting live from the centre of Cambridge, the Naked Scientists delve into the digital age we live in. We look at new, exciting ways to get kids into coding, how big data is changing the world of healthcare, and we take to skies to go drone racing. But what are the problems we face in this technological age? We find out who is using our online data, and explore the dangers of connecting to public Wi-Fi...
                Erlang User Conference: Building Massively Scalable Fault-Tolerant Systems – 25% Off        
      On 13-14 June, Stockholm will be the best place in Europe to discuss Multi-core, Big Data, Cloud, Embedded, NoSQL, Mobile and the Future of the Web. The Erlang User Conference 2013 features over 40 speakers, including top experts such as the inventors of Erlang - Mike Williams, Robert Virding and Joe Armstrong - and the author of Yaws [...]
                New issue of the free eStrategy-Magazin published        
      Issue 02/2013 of the eStrategy-Magazin is now available again as a free download at www.estrategy-magazin.de. As in past issues, the eStrategy team has once again bundled a wealth of expert knowledge on e-commerce and online marketing into more than 120 pages.

      The focus of issue 02/2013 is Big Data. Among other contributions, the Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) delivers a comprehensive article describing and analyzing the potential and possibilities of Big Data.

      This is complemented by many other exciting topics from the online world: alongside Big Data as the main theme, an expert article from eBay deals with the "Commerce Revolution". The eStrategy editorial team also conducted an expert interview with Zappos.com, offering a look behind the scenes of the most successful online shoe retailer. In addition, the new issue takes a closer look at the trending topic of the "sharing economy", and specialists share their know-how on current topics around Google Shopping, future commerce, usability, and more.

      The topics of the current eStrategy issue at a glance:

      E-Commerce
      • Big Data and the customer journey - dreams come true... 
      • The potential of web shops is still underestimated in Germany. No such thing as "can't": how manufacturers and retailers can score not only in the niche 
      • Sharing economy - the new era of the social web 
      • The corner shop 2.0, or Big Data 
      • Emotional usability - succeeding in e-commerce with heart and mind 
      • Big Data - a head start through knowledge. With new technologies towards the data-driven corner shop 
      • eBay: Commerce Revolution 
      • Big Data - buzzword or trend? 
      • Zappos.com - a look behind the scenes of one of the most successful online retailers 

      Online Marketing
      • Google Shopping - product data as a critical success factor 
      • Online Marketing Intelligence (OMI) 
      • Google AdWords 2.0 - enhanced campaigns and their possibilities 
      • How to steer SEO with good data

      Legal
      • Legal pitfalls of mobile advertising 
      • Big Data - between copyright and data protection law 
      • Open source software: legal foundations as well as opportunities & risks for companies

      As always, the current issue of the eStrategy-Magazin is rounded off with book recommendations and interesting surfing tips.

      Who is the magazine for?
      The magazine is read primarily by shop and website operators, agencies, consultancies, and IT and marketing executives, and is of course also intended for anyone else interested in e-commerce, online marketing, web development and mobile.

      Outlook
      Issue 03/2013 will be published on September 10. Topic planning for the coming issue is already under way, and guest authors are welcome to contact the eStrategy editorial team with topic proposals at info@estrategy-magazin.de. We also welcome feedback and inquiries about partnerships.

      Download it here...


                7 Key Takeaways Europe’s Customer Festival 2013 - CustFest        

      We went to London for Europe's Customer Festival and brought back 7 takeaways around the themes of the conference: Loyalty (understand, engage and retain your customers), Big Data (gain insights by understanding behavior), Omni-Channel (create a seamless experience online and offline) and Total Payments (the moment of payment is critical to a customer experience strategy).
                Capital Obviousness        
      The trends Galitsky will continue to follow in 2014 are "cloud" and mobile technologies and big data. Another promising direction lies in the area of merging real life with the internet. "Everyone thinks that with the internet and IT everything is already clear, everything is on the surface. Nothing of the sort. It has all only just begun," Galitsky emphasizes. "Even the oven in a bakery will be controlled by a computer, and everything before the oven - the arrival of flour and water, and delivery to the stores - will happen through the internet. If you add to this a knowledge of big data sets and manage the available information competently, you can grow an efficient company in any field."

      http://kommersant.ru/doc/2342344
                Society for Information Technology and Teacher Education Event Highlights Big Data        

      RANDA Solutions announces participation in a panel at this year's SITE 26th Annual Conference in Las Vegas, with an emphasis on big data, as three experts in education information technology present "Research on Implementing Big Data". The panel surveys the technologies, processes and change management dynamics of implementing big data initiatives.

      (PRWeb February 09, 2015)

      Read the full story at http://www.prweb.com/releases/2015/02/prweb12501411.htm


                What is Big Data? From Bytes to Petabytes        
      This post is the first of a series of posts related to Big Data, since I thought it was worth going in-depth with this topic. Big Data is a big word, a big buzzword - some might even call it big bullshit - since many components revolving around Big Data, and especially the ones on the analytics/methodology ...
                Gigamon launches a metadata tool for big data security analytics        
      Secuobs.com: 2016-04-14 16:00:58 - Global Security Mag Online - To meet the current demands of the market's security analytics approaches, Gigamon is adding a metadata tool to its GigaSECURE Security Delivery Platform (SDP) to improve its NetFlow/IPFIX generation capabilities, making the Gigamon platform the single source of network context. Traditional analysis processes performed on packets and logs, which have become more time-consuming and costly as volumes and transfer speeds have grown, have favored [...] - Products
                Datacenters in the age of Big Data        
      Secuobs.com: 2016-04-14 08:57:20 - Global Security Mag Online - Confronted with the challenges posed by Big Data, datacenters must make a technological leap. That is good news for their customers, and a reconfiguration of the market for the sector's players. 40 zettabytes of data, i.e. 40,000 billion gigabytes, should be produced by the Internet of Things in 2020, according to the research firm IDC - one illustration among others of the colossal and apparently limitless expansion of the data flows generated by everyday applications. If the [...] - Points of View
                Blockchain, the new guardian angel of insurance companies         
      Secuobs.com: 2016-04-12 16:37:09 - Global Security Mag Online - Born with the advent of bitcoin and cryptocurrencies, the blockchain - a technology for recording transactions in a secure ledger - can find applications in every sector of activity. It should contribute fully to the revolution the insurance market is experiencing today, and clearly represents a fine opportunity to reinvent our industry. Big data, dematerialization, machine learning: digitalization has created an earthquake in many areas of [...] - Points of View
                 "Le secret des Affaires" by attorneys Sabine Marcellin and Thibault du Manoir de Juaye has been published        
      Secuobs.com: 2016-04-11 15:49:26 - Global Security Mag Online - The protection of trade secrets is to the 21st century what the patent was to business in the two preceding centuries. In the era of big data, the company is centered on data, and the economic stakes of information are becoming considerable. Consequently, knowing the legal regime protecting trade secrets is essential for professionals. A proposed European directive of November 28, 2013, due to be submitted to Parliament in April 2016, a bill of the 16 [...] - Investigations
                Playing with Pi        

      A few months ago I decided to join the party and pick up a Raspberry Pi. It's a $25 full-fledged ARM-based computer the size of a credit card. There's also a $35 version, of which I've ended up buying a handful so far. Due to the low cost, you can use a computer in dedicated applications where it otherwise wouldn't be justified or practical. Since then I've been poring over the different things people have done with their Pi. Here are some that interest me:

      • Setting up security cameras or other dedicated cameras like a traffic cam or bird feeder camera
      • RaspyFi - streaming music player
      • Offsite backup
      • Spotify server
      • Carputer - blackbox for your car
      • Dashcam for my car
      • Home alarm system
      • Digital signage for the family business
      • Console emulator for old school consoles
      • Grocery inventory tracker

      Since the Pi runs an ARM based version of Linux, I'm already familiar with practically everything on that list. The OS I've loaded is Raspbian, a Debian variant. This makes it a lot easier to get up and running with.

      After recently divesting myself of some large business responsibilities, I've had more personal time to dedicate to things like this. Add in the vacation I took during Christmas and New Year's, and I had the perfect recipe to dive head-first into a Pi project. What I chose was something that I've always wanted.

      The database and Big Data lover in me wants data, lots of it. So I've gone with building a black box for my car that runs all the time the car is on, and logs as much data as I can capture. This includes:

      • OBD2
      • GPS
      • Dashcam
      • and more

      Once you've got a daemon running and the inputs are being saved, the rest is all just inputs. It doesn't matter what it is. It's just input data.

      My initial goal is to build a blackbox that constantly logs OBD2 data and stores it to a database. Looking around at what's out there for OBD2 software, I don't see anything that's built for long-term logging. All the software out there is meant for two use cases: 1) live monitoring, and 2) tuning the ECU to get more power out of the car. What I want is a third use case: long-term logging of all available OBD2 data to a database for analysis.

      In order to store all this data, I decided to build an OBD2 storage architecture composed of:

      • MySQL database
      • JSON + REST web services API
      • SDK that existing OBD2 software would use to store the data it's capturing
      • Wrapping up existing open source OBD2 capture data so it runs as a daemon on the Pi
      • Logging data to a local storage buffer, which then gets synced to the aforementioned cloud storage service when there's an internet connection.

      Right now I'm just doing this for myself. But I'm also reaching out to developers of OBD2 software to gauge interest in adding this storage service to their work. In addition to the storage, an API can be added for reading back the data, such as pulling DTC (error) codes, getting trends and summary data, and more.

      The first SDK I wrote was in Python. It's available on GitHub. It includes API calls to register an email address to get an API key. After that, there are some simple logging functions to save a single PID (an OBD2 data point such as RPM or engine temp). Since this has to run without an internet connection, I've implemented a buffer. The SDK writes to a buffer in local storage, and when there's any internet connection, a background sync daemon pulls data off the buffer, sends it to the API and removes the item from the buffer. Since this is all JSON data and very simple key:value data, I've gone with a NoSQL approach and used MongoDB for the buffer.
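      Stripped down to its essence, the buffer-and-sync design looks something like this (using pymongo and requests; the endpoint URL, API key and field names are made up for illustration):

      import time

      import requests
      from pymongo import MongoClient

      API_URL = "https://example.com/api/v1/log"  # hypothetical endpoint
      API_KEY = "my-api-key"                      # issued at registration

      buffer = MongoClient()["obd2"]["buffer"]    # local MongoDB buffer collection

      def log_pid(pid, value):
          """Called by the capture daemon; never touches the network."""
          buffer.insert_one({"pid": pid, "value": value, "ts": time.time()})

      def sync_once():
          """Run by the background sync daemon whenever we might be online."""
          for doc in buffer.find():
              payload = {"key": API_KEY, "pid": doc["pid"],
                         "value": doc["value"], "ts": doc["ts"]}
              try:
                  r = requests.post(API_URL, json=payload, timeout=5)
              except requests.RequestException:
                  return  # offline; leave the buffer alone and retry later
              if r.status_code == 200:
                  buffer.delete_one({"_id": doc["_id"]})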

      The API is built in PHP and runs on a standard Linux VPS in Apache. At this point the entire stack has been built. The code's nowhere near production-ready and is missing some features, but it works enough to demo. I've built a test utility that simulates a client car logging 10 times per second, logging 10 different PIDs each time. This all builds up in the local buffer, and the sync script then clears it out and uploads it to the API. With this setup, the client generates 100 data points per second. For a car being driven an average of 101 minutes per day, that's 606,000 data points per day.

      The volume of data will add up fast. For starters, the main database table I'm using stores all the PIDs as strings and stores each one as a separate record. In the future, I'll evaluate pivoting this data so that each PID has its own field (and appropriate data type) in a table. We'll see which method proves more efficient and easier to query. The OBD2 spec lists all the possible PIDs. Car manufacturers aren't required to use them all, and they can add in their own proprietary ones too. Hence my ambivalence, for now, about creating a logging table that contains a field for each PID. If most of the fields are empty, that's a lot of wasted storage. The two layouts I'm weighing are sketched below.
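      Roughly, the alternatives look like this (DDL held in Python strings; the table and column names are hypothetical, and the handful of PID columns shown are just examples):

      # Layout 1: one row per data point, PIDs stored as strings ("tall").
      TALL_DDL = """
      CREATE TABLE obd2_log (
          car_id BIGINT,
          ts     TIMESTAMP,
          pid    VARCHAR(8),   -- e.g. '010C' (RPM), '0105' (coolant temp)
          value  VARCHAR(32)   -- everything stored as a string
      )"""

      # Layout 2: one row per sample, one typed column per PID ("wide").
      # Wastes space if most PIDs are unused, which is why I'm undecided.
      WIDE_DDL = """
      CREATE TABLE obd2_log (
          car_id        BIGINT,
          ts            TIMESTAMP,
          rpm           FLOAT,
          coolant_temp  FLOAT,
          vehicle_speed FLOAT
          -- ...one column per PID the car actually reports
      )"""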

      Systems integration is much more of a factor in this project than coding each underlying piece. Each underlying piece, from what I've found, has already been coded somewhere by some enthusiast. The open source Python code already exists for reading OBD2 data. That solves a major coding headache and makes it easier to plug my SDK into it.

      There are some useful smartphone apps that can connect to a Bluetooth OBD2 reader to pull the data. Even if they were to use my SDK, it's still not an ideal solution for logging. In order to log this data, you need a dedicated device that's always on when the car's on and always logging. Using a smartphone can get you most of the way there, but there'll be gaps. That's why I'm focusing on using my Pi as a blackbox for this purpose.


                Re:Politics - USA        
      Huh... (wall-o-text alert!)
      THE SPLIT

      19 Reasons why Democrats will remain divided - and what it means for the party's future.

      Spoiler:
      Throughout most of the 2016 presidential primaries, the media focused on the noisy and reactionary rift among Republicans. Until the battle between Hillary Clinton and Bernie Sanders turned acrimonious in the home stretch, far less attention was paid to the equally momentous divisions within the Democratic Party. The Clinton-Sanders race wasn’t just about two candidates; instead, it underscored a series of deep and growing fissures among Democrats, along a wide range of complex fault lines—from age and race to gender and ideology. And these disagreements won’t fade with a gracious bow-out from Sanders, or a victory in November over Donald Trump. For all the talk of the Democrats’ need for “unity,” it would be a serious mistake to paper over the differences that came to the fore in this year’s primaries. More than ten million Democrats turned out in force this year to reject the party establishment’s cautious centrism and cozy relationship with Wall Street. Unless Democrats heed that message, they will miss a historic opportunity to forge a broad-based and lasting liberal majority.

      To help make sense of what’s causing the split, and where it’s headed, we turned to 23 leading historians, political scientists, pollsters, artists, and activists. Taken together, their insights reinforce the need for a truly inclusive and vigorous debate over the party’s future. “There can be no settlement of a great cause without discussion,” observed William Jennings Bryan, the original Democratic populist insurgent. “And people will not discuss a cause until their attention is drawn to it.”

      It goes way, way back

      BY RICK PERLSTEIN

      The schism between Hillary Clinton and Bernie Sanders is knit into the DNA of the modern Democratic Party, in two interrelated ways. The first is ideological: the battle of left versus right.

      Start in 1924, when the party cleaved nearly in two. That year, at Madison Square Garden, the Democratic convention took a record 103 ballots and 16 days to resolve a fight between the party’s urban wing and its conservative opponents. How conservative? Well, the convention was nicknamed the “Klanbake,” because one of the great issues at stake was—no kidding—whether the KKK was a good or a bad thing. The divide was so heated that tens of thousands of hooded Klansmen held a rally and burned crosses to try to bully the party into meeting their demands.

      Eight years later, under Franklin Roosevelt, the party’s urban, modernist wing established what would become a long hegemony over its reactionary, Southern one. But that hegemony remained sharply contested from the very beginning. In 1937, bipartisan opponents of FDR banded together to forge the “Conservative Manifesto.” Co-authored by a Southern Democrat, the manifesto called for lowering taxes on the wealthy, slashing government spending, and championing private enterprise. Hillary Clinton’s eagerness to please Wall Street can be traced, in part, to that ideological split during the New Deal.

      Indeed, over the years, many of the most “liberal” Democrats have remained sharply conservative on economic questions. Eugene McCarthy, the “peacenik” candidate of 1968, ended up backing Ronald Reagan. Dan Rostenkowski, the lunch-pail chairman of the House Ways and Means Committee, proposed a tax package in 1981 that was more corporate-friendly than Reagan’s. Jerry Brown of California, long derided as “Governor Moonbeam,” campaigned for president in 1992 on a regressive flat tax. That same year, Bill and Hillary Clinton won the White House with the business-funded support of the Democratic Leadership Council, which sought to downplay the “big government” solutions championed by FDR.

      Which brings us to the second strand in the party’s divided DNA: It’s sociological.

      Slate’s Jamelle Bouie has pointed out the pattern’s clocklike consistency: Since the beginning of the modern primary process in 1972, the Democratic divide has settled into a battle between an “insurgent” and the “establishment.” But Bouie errs, I think, in labeling every insurgent as “liberal.” Just look at Brown in 1992—an insurgent who was conservative on economic issues. Or Hubert Humphrey in 1968 and 1972—an establishment favorite whose signature legislative initiatives, including centralized planning boards to dictate industrial production, were more socialist than those of Sanders.

      This year, however, the traditional order of battle aligns with crystalline precision. Clinton, endorsed by 205 out of 232 Democratic members of Congress, is clearly the establishment’s pick—and also, increasingly, that of Wall Street masters of the universe terrified by the prospect of Donald Trump. Sanders represents the guerrilla faction, arrayed this time behind the economically populist banner of FDR.

      Does history tell us anything about how Democrats can bridge their long-running divide and forge a stronger, more unified party? Sanders would do well to remember that sore loserdom never helps. (“George McGovern is going to lose,” a leading Democrat supposedly vowed after Humphrey lost the nomination in 1972, “because we’re going to make him lose.”) And Clinton needs to recognize that campaigning on economic liberalism is almost always a good political bet. (Even at the height of Reagan’s morning-in-America blather in 1984, barely a third of American voters favored his plans to reduce the deficit by slashing social programs.)

      If Hillary has any doubts about embracing the economic agenda laid out by Sanders, she should ask the insurgent of 1992: William Jefferson Clinton. The man who ended a dozen years of presidential exile for the Democrats didn’t do it simply by promising to get tough on crime and to “end welfare as we know it.” He also pledged $80 billion in federal investments to improve America’s cities and to create four million new jobs—not to mention, of course, a plan to deliver health care to all Americans.

      It’s Obama’s fault for raising our hopes

      JACOB HACKER, PROFESSOR OF POLITICAL SCIENCE AT YALE AND CO-AUTHOR OF WINNER-TAKE-ALL POLITICS: We’ve now had almost eight years of a Democratic presidency. And with the exception of the policy breakthroughs in 2009 and 2010, they’ve been viewed as relatively lean years by many in the Democratic Party. There’s a sense of, “We went with someone within the system, and look what happened—Republicans still tried to crush that person. So let’s go for the whole thing.” There’s a sense that supporting the Democratic establishment and going the conventional route hasn’t been that productive.

      MYCHAL DENZEL SMITH, AUTHOR OF INVISIBLE MAN, GOT THE WHOLE WORLD WATCHING: A lot of young people who showed up to vote for Obama were voting for the very first time. But now they’re looking at the ways economic inequality persists, and they’re saying, “Oh, the Democratic Party doesn’t actually stand against that.” They’re looking at the deaths of Trayvon Martin and Michael Brown, the two big linchpins in the Black Lives Matter movement, and they’re like, “Oh, Democrats are actually the architects of the policies that have affected and continue to define young black life in terms of systemic, institutionalized racism.” So you have young folks getting into the Democratic Party and realizing they don’t have a place.

      ASTRA TAYLOR, AUTHOR OF THE PEOPLE’S PLATFORM: TAKING BACK POWER AND CULTURE IN THE DIGITAL AGE: This is in part a symptom of the expectations that people had for the Obama administration that weren’t met. It got its first major expression through Occupy Wall Street, and it’s still playing out. Because nothing has changed, and people know that.

      RUY TEIXEIRA, CO-AUTHOR OF THE EMERGING DEMOCRATIC MAJORITY: You can make the case that Obama has been a very successful and progressive president, but people are impatient. What used to keep people in line, so to speak, when they had these kinds of dissatisfactions was, “Oh, I’m really frustrated, but what can we do? The country is so right-wing. We’ve got to worry about the national debt—there’s no room in the system for change.” Now there’s much more of a sense of possibility. The Democratic Party has contributed to this transformation by becoming more liberal, and by ceasing to be obsessed with the national debt and the deficit.

      ELAINE KAMARCK, SENIOR FELLOW AT THE BROOKINGS INSTITUTION AND AUTHOR OF PRIMARY POLITICS: Here’s the irony—the Bernie people are the Obama people. They’re all the young people; that’s the Obama coalition. They’re frustrated because under Obama, nothing much happened that they liked. They’re taking it out on Hillary, which is unfortunate, since she’s much more capable of making something happen.

      JEDEDIAH PURDY, PROFESSOR OF LAW AT DUKE AND AUTHOR OF AFTER NATURE: The disappointment in Obama took a while to set in. The Obama campaign had the form and rhetoric of transformative politics, but not the substance. Many of us believed or hoped the substance might follow the form; but it didn’t. It turns out you need a program that challenges existing power and aims to reshape it. So Sanders represents the continuation of these insurgent energies. Clinton is also the continuation of Obama, but the Obama of governance, not of the campaign.

      It’s Hillary’s fault for lowering our hopes

      JOHN JUDIS, FORMER SENIOR EDITOR AT THE NEW REPUBLIC AND CO-AUTHOR OF THE EMERGING DEMOCRATIC MAJORITY: In 1984, you had Walter Mondale, a candidate of the Democratic establishment, pitted against a young upstart, Gary Hart. The split wasn’t left-right—it was young-old, energetic-tired, vision-pragmatism. Bernie, for all his 74 years, represents something still of the rebellious Sixties that appeals to young voters, while Hillary represents a tired incrementalism—utterly uninspiring and rooted largely in identity politics and special interest groups, rather than in any vision for the future.

      The party hasn’t kept up with its base

      JILL FILIPOVIC, LAWYER AND POLITICAL COLUMNIST: The party itself has been stuck in some old ideas for a while. You’ve been seeing movement around the edges, whether from Elizabeth Warren or these grassroots movements for income inequality. The pro-choice movement, for example, is a key part of the Democratic base that has liberalized and modernized and completely changed its messaging in a way that the party is now just catching up to. So you get these internal discords that dredge up a lot of bad feelings.

      DANIELLE ALLEN, DIRECTOR OF THE EDMOND J. SAFRA CENTER FOR ETHICS AT HARVARD: In the last 20 years, we’ve collectively experienced various forms of social acceleration. Rates of change in social dynamics have increased across the spectrum, from income inequality to mass incarceration to immigration to the effects of globalization and the restructuring of the economy. When you have an acceleration of social transformation, there’s a lag problem. The reigning policy paradigms will be out of sync with the actual needs on the ground. That’s what we’re experiencing now.

      JEDEDIAH PURDY: The people who have been drawn to the Sanders campaign have no love for or confidence in elites, Hillary’s habitus. And why should they? They’ve seen growing inequality and insecurity, the naked corruption of politics by oligarchic money, total cynicism in the political class of consultants and pundits, and wars so stupid and destructive that Trump can say as much and win the GOP primaries. There’s a whole world that people are surging to reject.

      Bernie’s supporters aren’t living in reality


      DAVID SIMON, CREATOR OF THE WIRE: I got no regard for purism. What makes Bernie so admirable is he genuinely believes everything that comes out of his mouth. It’s incredibly refreshing. If he didn’t have to govern with people who don’t believe what he’s saying, what a fine world it would be.

      I look at the hyperbole from Bernie supporters that lands on my doorstep. Either it’s stuff they believe—in which case they’re drinking the Kool-Aid, so they’re not even speaking in the vernacular of reality. Or what they’re doing is venal and destructive. That level of hyperbole, which Bernie himself is not responsible for, is disappointing. The truth is, it’s not just your friends who have utility in politics—sometimes it’s the people who are against you on every other issue. If you can’t play that game, then what did you go into politics for?

      THEDA SKOCPOL, PROFESSOR OF GOVERNMENT AND SOCIOLOGY AT HARVARD: A lot of Bernie supporters are upper-middle-class people. I’m surrounded by them in Cambridge. I’m not saying they’re hypocritical. I’m just saying they’re overplaying their hand by celebrating his focus on reining in the super-rich as the only way that we can talk about improving economic equality.

      ELAINE KAMARCK: This is part of a bigger problem with American presidential politics selling snake oil to the voters. Everybody from Trump with his stupid fething wall, to Sanders with, “Oh, free college for everybody.” Of all the dumb things—let’s go ahead and give all the rich kids in America a nice break. That’s not progressive, I’m sorry. But people want to believe in Peter Pan. And he’s just not there.

      MARK GREEN, FORMER PUBLIC ADVOCATE OF NEW YORK AND AUTHOR OF BRIGHT, INFINITE FUTURE: A GENERATIONAL MEMOIR ON THE PROGRESSIVE RISE: There’s a lot of adrenaline in primaries between purity and plausibility. Sanders is the most popular insurgent in American history to get this close to a nomination, and to help define the Democratic agenda. I admire his guts to run in the first place, and I get why his combination of Bulworth and Eugene Debs makes him such an appealing candidate. But the programmatic differences between a walking wish list like Sanders and a pragmatic progressive like Clinton are dwarfed by the differences between either of them and the first proto-fascist president.

      There’s a double standard against Hillary

      JILL FILIPOVIC: The dovetailing of gender and wealth in this election is really striking. I don’t remember a lot of Democrats ripping John Kerry to shreds for being wealthy when he ran for president. But it’s been interesting to see Clinton demonized for her Goldman Sachs speeches. For some Democrats, that seems to be inherently disqualifying. Obviously, money would be an issue even if she were a male candidate, because this is an election that’s about income inequality. But the sense that she’s somehow undeserving, that does strike me as gendered.

      THEDA SKOCPOL: Older women support Clinton because they’ve witnessed her career, and she’s always been into economic redistribution. Some Sanders followers have been quite sexist in things they’ve said; that’s very apparent to older women. A friend who studies abortion politics tells me that the nasty tweets she’s gotten from Bernie supporters for backing Hillary are worse than anything she gets from the right wing.

      AMANDA MARCOTTE, POLITICS WRITER FOR SALON: What you're seeing is a huge drift in the party, away from having our leadership be just a bunch of white men who claim to speak for everybody else. We're moving to a party that puts women's interests at the center, that considers the votes of people of color just as valuable as the votes of white people. Unfortunately, some of the support for Sanders comes from people who are uncomfortable with that change and are looking to a benevolent, white patriarch to save them.

      ELAINE KAMARCK: Clinton is being penalized because she has a realistic view of what can be done, and that leads people to mistake her for some kind of bad conservative. She’s not. She’s extraordinarily liberal, particularly on children and families. But because she’s been around a while, when Sanders comes out with this new radical stuff, they think, “Oh, he’s the one whose heart is in the right place.” But listen, she took on Wall Street before he did, in a way that hit their bottom line. If people really want to get something done, they’d vote for her.

      MARK GREEN: Look, there’s a debate I have with my friend Ralph Nader. He sees Hillary as more Wall Street, and I see her as more Wellesley. She’s as smart as anyone, grounded, practical, engaging, and unlike most testosterone-fueled male politicians, actually listens more than lectures. So she’s not as dynamic a candidate as Bill and Barack? Who is? That’s an unfair comparison. But if I had to bet, I’d guess she’ll be as consequential and good a president as either of them.

      Poverty is fueling the divide

      BY KEEANGA-YAMAHTTA TAYLOR

      The Democratic Party today engages in delusional happy talk about economic recovery, while a staggering 47 million Americans are struggling in poverty. As the rich remain as wealthy as ever, working-class people continue to see their wages stagnate. In the 1970s, 61 percent of Americans fell into that vague but stable category of “middle class.” Today that number has fallen to 50 percent. African Americans, the core of the Democratic Party base, continue to be plagued by dead-end jobs and diminished prospects. Fifty-four percent of black workers make less than $15 an hour. Thirty-eight percent of black children live in poverty. More than a quarter of black households battle with hunger.

      This is the heart of the crisis within the Democratic Party. Eight years ago, the party ran on hope: “Yes, we can” and “Change we can believe in.” Pundits openly wondered whether the United States was on the cusp of becoming a “postracial” nation; on the eve of Obama’s first inauguration, 69 percent of black Americans believed that Martin Luther King’s “dream” had been fulfilled. Today, the tune is quite different: Millions of Americans are more disillusioned and cynical than ever about the ability of the state to provide a decent life for them and their families.

      Bernie Sanders tapped into the palpable disgust at America’s new Gilded Age, and it’s a revulsion that will not be quieted with a few platitudes from Hillary Clinton to “give the middle class a raise.” Yet the Democratic leadership continues to treat Sanders as an unfortunate nuisance. The party keeps charging ahead the way it always has, as Clinton pivots to her right to appeal to disgruntled Republican voters. As long as the party has no challengers to its left, the thinking goes, its base has nowhere else to go.

      This strategy may lead Clinton to victory in November. But there is a danger here: In winning the battle, she very well may lose the war being waged within the Democratic ranks. The inattention to growing inequality, racial injustice, and deteriorating quality of life will likely result in ordinary people voting with their feet and simply opting out of the coming election, and future ones as well. Millions of Americans already do not vote, because most elected officials are out of touch with their daily struggles, and because there is little correlation between voting and an improvement in their lives. By continuing to ignore the issues Sanders has raised, Clinton and the rest of the party establishment risk losing a huge swath of the Democratic electorate for years to come.

      There is a way out. More and more voters are identifying as independents. This demonstrates that people want real choices—as opposed to politics driven by sound bites, political action committees, and billionaire candidates. The wide support for both Sanders and Trump points to the incredible vacuum that exists in organized politics. If the movements against police racism and violence were to combine with the growing activism among the disaffected, from low-wage workers to housing advocates, we could build a political party that actually represents the interests of the poor and working class, and leave the Democrats and the Republicans to the plutocrats who already own both parties’ hearts and minds.

      It’s the economy, stupid

      JOHN JUDIS: There have been insurgencies before—George Wallace in ’64 and ’72—that were radical. What made Wallace radical was the split in the party over civil rights. What makes Sanders radical is the lingering rage over the Great Recession.

      If you want to move the question up a level theoretically, you can talk about the failure of “new Democrat” politics to deliver prosperity or economic security. Clinton and the Democrats in Washington don’t understand the level of anxiety that Americans, and particularly the young, feel about their economic prospects. It can’t be addressed by charts showing the drop in the unemployment rate.

      BRETT FLEHINGER, HISTORIAN AT HARVARD AND AUTHOR OF THE 1912 ELECTION AND THE POWER OF PROGRESSIVISM: The Democratic Party has done a poor job of delivering on the economic promises of equality. That’s what’s opened up the possibility for Sanders. It’s what he’s believed in for 20-plus years. But the question is: What’s making it resonate now? It’s the failure of the party to liberalize, since Bill Clinton.

      JACOB HACKER: There’s a feeling of, “Really? This is it? This is the recovery we’ve been promised?” It’s been a long, difficult path since 2008 and the financial crisis. Even Democratic voters who are doing pretty well are feeling that something has gone seriously awry.

      This may be the first time in my life that there’s been a full-throated critique of the Democratic Party as being excessively beholden to money and too willing to work within the system. You saw echoes of this in the Howard Dean campaign, and you saw it much more forcefully in 2000 with Ralph Nader. But Nader was not running within the Democratic Party; he was clearly playing a spoiler role. Whereas Sanders is essentially trying to take the Democratic Party in a different direction.

      JEDEDIAH PURDY: Bernie’s campaign is the first to put class politics at its center. Not poverty, which liberal elites have always been comfortable addressing, and not “We are the 99 percent,” which is populist in a more fantastical sense, but class more concretely: the jobs and communities of blue-collar people, the decline of the middle class, the cost of education.

      MARK HUGO LOPEZ, DIRECTOR OF HISPANIC RESEARCH AT THE PEW RESEARCH CENTER: When you ask Clinton supporters, or people who see Clinton favorably, you’ll find that more than half will say that, compared to 50 years ago, life is better in America today. Whereas among Sanders supporters, one-third will say that things are actually worse.

      Democrats are too fixated on white workers

      JILL FILIPOVIC: The class-based concerns that a lot of the loudest voices in the Sanders contingent of the Democratic Party focus on are the concerns of the white working class, and they aren’t bringing a lot of race analysis into it. The income-inequality argument makes a case, particularly to the white working class, in a way that seems to have alienated African Americans and, to a lesser extent, the Hispanic vote.

      MYCHAL DENZEL SMITH: Look at every demographic breakdown of who votes. The strongest Democratic Party voters are black women. So why is it that you’re so zeroed in and focused on regaining the white working-class vote? What value does that have to you, as opposed to appeasing the voters that are actually there for you? Democrats want it both ways. They want to attract the white working-class voter again, but what they don’t accept is that the reason they lost that voter is because of Republican appeals to racism. So the Democrats want to be the party of anti-racism but also win back the racists. You can’t do that! Why would you want a coalition of those people? It doesn’t make sense.

      Democrats have neglected white workers

      DAVID SIMON: There’s certainly something unique about this moment, and the populist rebellion that has affected both the Republican and Democratic parties. And I think it’s earned. Both parties can be rightly accused, not to the same degree, of having ignored and abandoned the working class and the middle-middle class for the past 30 years.

      Millennials of color are tired of waiting

      ALAN ABRAMOWITZ, PROFESSOR OF POLITICAL SCIENCE AT EMORY AND AUTHOR OF THE POLARIZED PUBLIC? WHY AMERICAN GOVERNMENT IS SO DYSFUNCTIONAL: Why are African Americans so loyal to the Clintons? Part of it is just familiarity. They feel a comfort level with the Clintons, and they really like Bill Clinton, especially older African American voters. But there’s a generational divide even among African American voters. Younger African Americans and Latinos are not as supportive of Clinton.

      MARK HUGO LOPEZ: I was in Chicago recently, and I was surprised when a young Latina college student stood up and described how much she did not like Clinton. She actually said, “I hate Hillary Clinton.” That’s the phrase she used, which drew a round of applause from everybody in the room.

      JOHNETTA ELZIE, A LEADER OF BLACK LIVES MATTER: I don’t think anyone was ready to deal with black millennials. I just don’t believe that anyone in politics who is running on a national scale knows how to address young black or brown people in a way that’s different from how they addressed our elders. Because we’re not the same.

      I remember when Hillary got shut down by some young black students in Atlanta. They wanted to know, “What does she even know about young black people in this neighborhood and what we go through?” John Lewis basically told them, “You need to wait to speak to Hillary. Just be polite, ask questions, yada yada.” And people were like, “But you were a protester before you were a politician! You know what it is, you know the sense of urgency, you know what it means to be told to wait and to know that we don’t have time to wait.”

      MYCHAL DENZEL SMITH: Throughout our history, progressive movements have often left out the idea of ending racism. Then they go to communities of color and say, “What choice do you have but to join with us—to put aside your concerns about the differences that we experience in terms of racism?” In this election, the movement on the ground has at least pushed Democrats to adopt the language of anti-racism. They’ve had to say things like “institutionalized racism”—they’re learning the language on the fly. The problem is, they understand that they don’t actually have to move on these issues, because they have Trump to run against. All they have to do is say, “Look at how crazy the other option is. Where else are you going to go?”

      Authenticity is gender biased

      BY RIVKA GALCHEN


      In an early scene in Stendhal’s The Red and the Black, a carpenter’s son hired as a tutor for a wealthy family dons a tailored black suit provided by his new employer. The black suit was a new and radical thing in this era, one in which bakers dressed like bakers, nobility like nobility. In a black suit, one’s social class was cloaked—a form of what back then was often termed hypocrisy.

      Lately, as I’ve followed the contest between Hillary Clinton and Bernie Sanders, I’ve found myself thinking of The Red and the Black, and its play with antiquated notions of authenticity. The passionate support for Sanders has, one hopes, much to do with excitement about his insistent expression of a platform of economic populism. But it would be naïve to think it doesn’t also have to do with his appearance, his way of speaking. There is authenticity, and there is appearing authentic. These two things may mostly align—as they largely but not entirely do with Sanders. (Most anti-establishment figures avoid 35 years in government.) Or they may almost perfectly not align—as in the case of Donald Trump. (A liar celebrated for speaking the truth.) Either way, it’s worth investigating authenticity in our political thinking, both to understand its power and to consider how it helps or hurts the kind of effective, forward-looking agenda that we hope will emerge from a fractured Democratic Party.

      One problem with authenticity as a campaign tactic is its unsettling, subconscious alliance with those who benefit from the status quo. If you’re not who you say you are—if you’re moving on the social ladder, or are not in “your place”—you’re inauthentic. Keeping it real subtly advocates for keeping it just like it is.

      The semiotics of Sanders’s political authenticity—dishevelment, raised voice, being unyielding—are available to male politicians in a way they are not to women (and to whites in a way they are not to blacks or Hispanics or Asians). Black women in politics don’t have the option to wear their hair “natural”; nearly all white women appear to have blowouts, even Elizabeth Warren. It’s nonsense, and yet the only politically viable option, and therefore not nonsense.

      It’s not just that research has shown that women are perceived to talk too much even when they talk less, or that men who display anger are influential while women who do so are not. It’s that there is no such thing as “masculine wiles.” The phrase just doesn’t exist. This doesn’t mean that calling into question Clinton’s authenticity and trustworthiness—the fault line along which the Democratic Party has riven—is pure misogyny. It just means that it’s not purely not misogyny.

      Clinton is often described as the institutional candidate, the establishment. There’s a lot of truth to that. But she’s also the woman who initially kept her name (and her job) as the wife of the governor of Arkansas, who used the role of First Lady as cover to push for socialized health care, and who was instrumental in getting health insurance for eight million children past the Republican gorgons when a full reform failed. Someone who has survived being attacked for nearly 40 years must possess a highly developed sense of what the critic Walter Benjamin calls “cunning and high spirits”—the means by which figures in fairy tales evade the oppressive forces of myth, and mortals evade gods. Somehow she achieved one of the more liberal voting records in the Senate, despite rarely being described as a liberal by either the left or the right.

      Perhaps one reason that Clinton’s “firewall” of black support has remained standing is that “authenticity” has less rhetorical force with a historically oppressed people, for whom that strategy—being recognizably who people in power think you ought to be—was never viable. There are, of course, important and substantial criticisms of Clinton. But perhaps when we say that Hillary is inauthentic, we’re simply saying that she is a woman working in the public eye.

      Democrats on both sides of the party should consider which tactic best suits the underdogs they feel they are defending, and want to defend. Whoever receives the nomination, perhaps the worry should shift from whether the candidate is cunning to whether the candidate—and the Democratic Party—can be cunning enough.

      The disruption is digital

      BY ZEYNEP TUFEKCI

      Insurgents like Bernie Sanders have been the rule, not the exception, in the modern era of Democratic politics. From Eugene McCarthy to Jesse Jackson, the party’s left wing regularly broke ranks to run on quasi-social democratic platforms. But with the exception of George McGovern in 1972, these challengers all fell short of the nomination, partly because they lacked the money to effectively organize and advertise. The party establishment had a virtual monopoly on every political tool needed to win.

      Slowly at first—and then with a big, loud bang—digital technologies changed all that. First came Howard Dean, who used the internet to “disrupt” the Democratic Party in 2004. Powered by small online donations and digitally organized neighborhood “meetups,” Dean outraised his big-money rivals and revolutionized the way political campaigns are funded. Four years later, Barack Obama added a digitally fueled ground game to Dean’s fund-raising innovations, creating a campaign machine that could identify and turn out voters with a new level of accuracy. But when Obama’s policies fell short of the left’s expectations, many turned their energies to building a different kind of digital rebellion—this time, outside of electoral politics.

      Sparked by a single email in June 2011, Occupy Wall Street exploded in a matter of months into a worldwide movement that mobilized massive street protests—including many who’d sworn off partisan politics as hopelessly corrupted. Occupy demonstrated how the masses could organize without a campaign or a candidate to rally around, opening a space that would soon be joined by Black Lives Matter and other activist groups. It also unleashed a populist fervor on the left. As the 2016 campaign approached, Occupy veterans joined forces with left-leaning activists inside the party. Instead of rejecting traditional politics, they decided to disrupt the Democratic primaries, the way Tea Party activists did to the GOP in 2010 and 2012.

      In some ways, it didn’t matter that Sanders was the candidate they rallied behind. His ideological consistency earned him the trust of the left, and they in turn stoked his online fund-raising—producing the flood of $27 average donations that kept him competitive with Hillary Clinton. In the spirit of Occupy, Sanders’s digital operation was more volunteer-driven and dispersed than Obama’s; instead of “Big Data,” the watchword for Sanders was “Big Organizing,” as hundreds of thousands of volunteers effectively ran major parts of the show. A pro-Sanders Reddit group attracted almost a quarter-million subscribers, who helped organize everything from voter-registration drives to phone banks. A legion of young, pro-Sanders coders on Slack produced apps to mobilize volunteers and direct voters to the polls. There was even a BernieBNB app, where people could offer their spare couches to #FeelTheBern organizers.

      Ultimately, the Sanders campaign became a lesson in both the potential and the limitations of a digitally fueled uprising. It seems miraculous that a 74-year-old democratic socialist could come so close to beating a candidate with Clinton’s institutional advantages. But Sanders’s superior digital reach couldn’t help him win over African Americans and older women, most of whom favor Clinton. And all his fans on social media could not alter the mainstream media’s narrative that this was yet another noble but doomed insurgency.

      Whether or not Clinton wins in November, it’s safe to expect another Democratic insurgency in 2020—and beyond. Digital fund-raising, organizing, and messaging have given the left the weapons not just to tilt at the establishment’s windmills, but to come close to toppling them. Next time, they might just succeed.

      Split? What split?

      RUY TEIXEIRA: I don’t see differences massive enough to provoke any kind of split that has serious consequences. It’s just part of an ongoing shift in the Democratic Party. The party is going to continue to consolidate behind a more aggressive and liberal program, and the Sanders people are a reflection of that. We shouldn’t lose track of the fact that Clinton will be the most liberal presidential candidate the Democrats have run since George McGovern.

      BRETT FLEHINGER: In historic terms I don’t think this party is split. I don’t even think the divide is as big as it was in 2000, when a significant portion of Democratic voters either considered Ralph Nader or voted for Nader.

      ALAN ABRAMOWITZ: It’s easy to overstate how substantial the divide is. Some of it is more a matter of style, the sense that Clinton and some of these longtime party leaders are tainted by their ties to Wall Street and big money. But it’s not based so much on their issue positions, because Clinton’s issue positions are pretty liberal. Not as far left as Bernie—but then, nobody’s as far left as Bernie. Part of it is a distortion, because you can’t get to Bernie’s left, except maybe on the guns issue. So Bernie can always be the one taking the purist position.

      THEDA SKOCPOL: This isn’t a revolution. The phenomenon of having a left challenger to somebody called an establishment Democrat goes way back. It’s been happening my whole life, and I’m not a child. It’s never successful, except in the case of Obama. And Obama had something that the other challengers didn’t: He was able to appeal to blacks. Most of these left candidates appeal to white liberals, and Sanders is certainly in that category. His entire base is white liberals.

      KEVIN BAKER, AUTHOR OF THE NOVEL STRIVERS ROW: Democrats have almost always been the party that co-opts and brings in literal outsiders and outside movements. In the late nineteenth century, it was a bizarre coalition between Southern bourbon planters and big-city machines, which each had their own grievances. Then it was an uneasy coalition between those same machines and the agrarian populists brought in by William Jennings Bryan. Then you had the Grand Coalition, the biggest, most diverse coalition in American history, which was the New Deal one: farmers and workers, urbanites and Main Street progressives, blacks, whites, feminists, unionists. It lasted a long time, until it broke down over race and the Vietnam War in the 1960s. Finally, you had the rise of the Democratic Leadership Council and the Clinton-ite and Obama-ite version of more conservative progressivism. But what that coalition left unanswered, for a lot of people in the party and in the country, was just how they were going to make a living in this new world. What we’re seeing now is a very civil contest, relatively speaking, over who is going to lead that coalition.

      Don’t worry: Trump will unite us


      JOHN JUDIS: Whatever shortcomings Clinton’s campaign has in creating unity are likely to be overcome by the specter of a Trump America.

      RUY TEIXEIRA: I don’t see the people who support Sanders, particularly the young people, as being radically different from the Clinton folks in terms of what they support. They’ll wind up voting for Hillary when she runs against Trump.

      DAVID SIMON: If you’re asking me if I think the Democratic Party will heal in the general election, I think it will. Trump helps that a lot. The risks of folding your arms and walking away are fundamental, in a way they might not be with a more viable and coherent candidate. But let’s face it, the idea of this man at the helm of the republic is some scary gak.

      Bernie isn’t the future, but his politics are

      ALAN ABRAMOWITZ: Younger voters are the future of the Democratic Party. But Bernie Sanders is not the future of the Democratic Party. The question is: Who’s going to come along who can tap into that combination of idealism and discontent that he represents?

      JOHN JUDIS: Sanders is an old guy, like I am, and not one, I suspect, to build a movement. And I think “movement” is probably the wrong word. What inspires movements is particular causes (Vietnam, civil rights, high taxes) or a party in power that is seen as taking the wrong stance on those issues (George W. Bush for liberals, Barack Obama for Republicans). If Clinton is the next president, I don’t expect a movement to spring up. Instead, I’d expect to see caucuses within the party that take a Bernie Sanders/Elizabeth Warren point of view. But if Trump wins, you will see a movement, whatever Sanders does.

      JACOB HACKER: There’s a growing chunk of the Democratic electorate that believes the existing policy ideas that define the mainstream of the party don’t go far enough. The question becomes: What do those folks do after the election? What kind of force will they be within the party going forward? Can they form a strong movement that will press national politicians to move to the left, the way the Tea Party did on the right?

      If a Democrat wins in November, you probably can’t get a movement like the Tea Party under Obama, or Move On under Bush. But what you could get—what you would hope to get—is a true grassroots, longer-term movement that tries to move the center of gravity of American politics to the left.

      JEDEDIAH PURDY: But what would a movement built out of Sanders supporters be for, exactly? The campaign itself gives some answers. The Sanders campaign is much more distinct from the Clinton campaign, in substance, than Obama’s first campaign was. The Fight for $15, single-payer health care, stronger antitrust law, free college: These are huge, concrete goals. If people can organize around one guy who expresses them but, if elected, could do very little unless we also changed Congress, then we should be able to organize around them to try to change the makeup of political structures from top to bottom. Maybe we need to move into our local Democratic parties. The Moral Majority took over school boards with a specific agenda they could implement. Are there electoral institutions, as well as party institutions, that we should be aiming to reshape in our image?

      DANIELLE ALLEN: It’s a huge opportunity for Democrats, if they can take all the incoming young participants seriously and give them a real role in digging into hard policy questions. This is a chance to cultivate leaders who can run for office across the landscape—not just national office, but local office. The Republicans have done a much better job, in all honesty, at growing up a generation of younger politicians. Democratic politicians skew older, so that sums up the real question about the Sanders moment: Is this enough of a wake-up call to the Democratic Party to start bringing talent in?

      It’s a trap!

      ASTRA TAYLOR: The young thing, this millennial left turn, is great. But there’s a part of me that’s afraid. In the 1960s, the story was the counterculture and the new left. It was Students for a Democratic Society, the civil rights movement, the war in Vietnam. But there’s been a lot of smart revisionist scholarship that says the story of the ’60s was not the new left, it was actually the new right, which spent the decade laying the groundwork for its resurgence. At this moment, when left-wing millennials are getting a lot of attention, my fear is that there’s a conservative counterpoint that I’m just not seeing, because we’re all in our little social and political bubbles. We should study the split between the new left and the new right in the ’60s, and make sure that history doesn’t repeat itself.

      The worst thing would be to ignore the split

      DAVID SIMON: The Democrats are going to win, because they’re up against Trump. But I’m worried they’re going to paper over a fundamental flaw in their coalition, which is: You’ve got to help working people and the middle-middle class. They’re not your guaranteed votes, and you lost them once to Reagan. Maybe you can do without them long-term. But I would get them back because (a) it secures your coalition going forward and (b) it’s the right thing to fething do.

      JILL FILIPOVIC: The brawls that people are having on Twitter every day—I don’t know if that’s healthy for the party. But the bigger debates are really important conversations to be having. Who is our coalition? Who are we representing, and how do we best do that? Do we want to be the center-left party of the ’90s, or should we be serving a more diverse and liberal voter base? I don’t think those conversations are going to destroy the party. I think they’re going to set us in a better direction.

      JACOB HACKER: It’s nice to be able to talk about what’s happening on the Democratic side, because all of the focus has been on the Republican side. It’s a bit like living in a house that’s got some peeling paint and holes in the roof. Right next to it is a derelict building that’s practically falling over. And you’re like, “Man, I’ve got a nice house.” But if you just put your hand up and cover up your neighbor’s house so you can’t see it, you’d be like, “Um, I think my house needs some work.” The Democratic Party is kind of like that right now. I want to live there, but I really would love to upgrade it.

      The best is yet to come

      BY NAOMI KLEIN


      On the surface, the battle between Hillary Clinton and Bernie Sanders looks like a deep rift, one that threatens to splinter the Democratic Party. But viewed in the sweep of history, it is evidence of something far more positive for the party’s base and beyond: not a rift but a shift—the first tremors of a profound ideological realignment from which a transformative new politics could emerge.

      Many of Bernie’s closest advisers—and perhaps even Bernie himself—never imagined the campaign would do so well. And yet it did. The U.S. left—and not some pale imitation of it—actually tasted electoral victory, in state after state after state. The campaign came so close to winning that many of us allowed ourselves to imagine, if only for a few, furtive moments, what the world would look like with a President Sanders.

      Even writing those words seems crazy. After all, the working assumption for decades has been that genuinely redistributive policies are so unpopular in the U.S. that they could only be smuggled past the American public if they were wrapped in some sort of centrist disguise. “Fee and dividend” instead of a carbon tax. “Health care reform” instead of universal public health care.

      Only now it turns out that left ideas are popular just as they are, utterly unadorned. Really popular—and in the most pro-capitalist country in the world.

      It’s not just that Sanders has won 20-plus contests, all while never disavowing his democratic socialism. It’s also that, to keep Sanders from hijacking the nomination, Clinton has been forced to pivot sharply to the left and disavow her own history as a market-friendly centrist. Even Donald Trump threw out the economic playbook entrenched since Reagan—coming out against corporate-friendly trade deals, vowing to protect what’s left of the social safety net, and railing against the influence of money in politics.

      Taken together, the evidence is clear: The left just won. Forget the nomination—I mean the argument. Clinton, and the 40-year ideological campaign she represents, has lost the battle of ideas. The spell of neoliberalism has been broken, crushed under the weight of lived experience and a mountain of data.

      What for decades was unsayable is now being said out loud—free college tuition, double the minimum wage, 100 percent renewable energy. And the crowds are cheering. With so much encouragement, who knows what’s next? Reparations for slavery and colonialism? A guaranteed annual income? Democratic worker co-ops as the centerpiece of a green jobs program? Why not? The intellectual fencing that has constrained the left’s imagination for so long is lying twisted on the ground.

      This broad appetite for systemic change did not begin with Sanders. During the Obama years, a wave of radical new social movements emerged, from Occupy Wall Street and the Fight for $15 to #NoKXL and Black Lives Matter. Sanders harnessed much of this energy—but by no means all of it. His weaknesses reaching certain segments of black and Latino voters in the Democratic base are well known. And for some activists, Sanders has always felt too much like the past to get overly excited about.

      Looking beyond this election cycle, this is actually good news. If Sanders could come this far, imagine what a left candidate who was unburdened by his weaknesses could do. A political coalition that started from the premise that economic inequality and climate destabilization are inextricable from systems of racial and gender hierarchy could well build a significantly larger tent than the Sanders campaign managed to erect.

      And if that movement has a bold plan for humanizing and democratizing new technology networks and global systems of trade, then it will feel less like a blast from the past, and more like a path to an exciting, never-before-attempted future. Whether coming after one term of Hillary Clinton in 2020, or one term of Donald Trump, that combination—deeply diverse and insistently forward-looking—could well prove unbeatable.


                Loulou Cherinet on exclusion, decay and Big Data        
      Loulou Cherinet was born in 1970 and has exhibited at most of the major art biennials. With her dual roots in both Sweden and Ethiopia, she works mostly in the international field. She is also a professor at Konstfack in Stockholm, and is currently showing five video works at Moderna Museet under the title Who Learns My Lesson Complete?, a quotation from the poet Walt Whitman.
                When is Network Lasso Accurate?, The Network Nullspace Property for Compressed Sensing of Big Data over Networks, Semi-Supervised Learning via Sparse Label Propagation        



      Alex just mentioned this to me:

      I started to translate recovery conditions from compressed sensing into the graph signal setting. So far, I managed to relate the null space property and variants of the restricted isometry properties to the connectivity properties (topology) of networks. In particular, the conditions amount to the existence of certain network flows. I would be happy if you have a look and share your opinion with me:
      Greetings from Finland, Alex

      Thanks Alex ! Here are the preprints:


      A main workhorse for statistical learning and signal processing using sparse models is the least absolute shrinkage and selection operator (Lasso). The Lasso has been adapted recently for massive network-structured datasets, i.e., big data over networks. In particular, the network Lasso allows one to recover (or learn) graph signals from a small number of noisy signal samples by using the total variation semi-norm as a regularizer. Some work has been devoted to studying efficient and scalable implementations of the network Lasso. However, little is known about the conditions on the underlying network structure which ensure high accuracy of the network Lasso. By leveraging concepts of compressed sensing, we address this gap and derive precise conditions on the underlying network topology and sampling set which guarantee that the network Lasso delivers an accurate estimate of the entire underlying graph signal.
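
      To make the objective concrete, here is a minimal sketch of the network Lasso on a toy chain graph, solved with the generic convex solver cvxpy rather than the scalable methods the papers study; the graph, weights, sampling set, and noise level below are all invented for illustration.

      import numpy as np
      import cvxpy as cp

      n = 10                                      # nodes of a toy chain graph
      edges = [(i, i + 1) for i in range(n - 1)]
      w = np.ones(len(edges))                     # edge weights

      true_x = np.concatenate([np.zeros(5), np.ones(5)])  # piecewise-constant signal
      sampled = [0, 2, 4, 6, 8]                   # sampling set (observed nodes)
      rng = np.random.default_rng(0)
      y = true_x[sampled] + 0.1 * rng.standard_normal(len(sampled))

      lam = 0.5                                   # total-variation weight
      x = cp.Variable(n)
      fit = cp.sum_squares(x[sampled] - y)        # empirical loss on the sampling set
      tv = sum(w[k] * cp.abs(x[i] - x[j]) for k, (i, j) in enumerate(edges))
      cp.Problem(cp.Minimize(fit + lam * tv)).solve()

      print(np.round(x.value, 2))                 # recovered graph signal

      Because the total variation penalty promotes piecewise-constant solutions, the recovered signal should fill in the unobserved nodes; whether it does so accurately is exactly what the paper's conditions on topology and sampling set are about.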


      We adapt the nullspace property of compressed sensing for sparse vectors to semi-supervised learning of labels for network-structured datasets. In particular, we derive a sufficient condition, which we term the network nullspace property, for convex optimization methods to accurately learn labels which form smooth graph signals. The network nullspace property involves both the network topology and the sampling strategy and can be used to guide the design of efficient sampling strategies, i.e., the selection of those data points whose labels provide the most information for the learning task.


      This work proposes a novel method for semi-supervised learning from partially labeled massive network-structured datasets, i.e., big data over networks. We model the underlying hypothesis, which relates data points to labels, as a graph signal, defined over some graph (network) structure intrinsic to the dataset. Following the key principle of supervised learning, i.e., similar inputs yield similar outputs, we require the graph signals induced by labels to have small total variation. Accordingly, we formulate the problem of learning the labels of data points as a non-smooth convex optimization problem which amounts to balancing the empirical loss, i.e., the discrepancy with some partially available label information, against the smoothness quantified by the total variation of the learned graph signal. We solve this optimization problem by appealing to a recently proposed preconditioned variant of the popular primal-dual method of Pock and Chambolle, which results in a sparse label propagation algorithm. This learning algorithm allows for a highly scalable implementation as message passing over the underlying data graph. By applying concepts of compressed sensing to the learning problem, we are also able to provide a transparent sufficient condition on the underlying network structure such that accurate learning of the labels is possible. We also present a message passing formulation of the algorithm which allows for a highly scalable implementation in big data frameworks.
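
      As a rough illustration of the optimization problem being solved, here is a compact numpy sketch of TV-regularized label recovery using the plain (unpreconditioned) Chambolle-Pock primal-dual iteration; this is a simplification, not the paper's preconditioned message-passing algorithm, and all data below are synthetic.

      import numpy as np

      n = 10
      edges = [(i, i + 1) for i in range(n - 1)]
      D = np.zeros((len(edges), n))               # incidence (graph difference) operator
      for k, (i, j) in enumerate(edges):
          D[k, i], D[k, j] = 1.0, -1.0

      true_x = np.concatenate([np.zeros(5), np.ones(5)])
      mask = np.zeros(n)
      mask[[0, 2, 4, 6, 8]] = 1.0                 # nodes with known labels
      obs = mask * true_x                         # partially observed labels

      lam = 0.5                                   # total-variation weight
      L = np.linalg.norm(D, 2)                    # operator norm of D
      tau = sigma = 0.9 / L                       # step sizes, so tau*sigma*L**2 < 1

      x = np.zeros(n); x_bar = x.copy(); u = np.zeros(len(edges))
      for _ in range(500):
          # dual step: prox of the conjugate of lam*||.||_1 is clipping to [-lam, lam]
          u = np.clip(u + sigma * (D @ x_bar), -lam, lam)
          # primal step: prox of the squared loss (1/2)*(x_i - obs_i)^2 on sampled nodes
          x_new = (x - tau * (D.T @ u) + tau * mask * obs) / (1.0 + tau * mask)
          x_bar = 2.0 * x_new - x                 # extrapolation
          x = x_new

      print(np.round(x, 2))                       # learned labels as a graph signal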






                21.09.2017: DOAG 2017 Big Data Days        
      Data lakes, analytics solutions, database programming, artificial intelligence
                Scientists turn to big data in hunt for minerals, oil and gas        

      OSLO: Scientists searching for everything from oil and gas to copper and gold are adopting techniques used by companies such as Netflix or Amazon to sift through vast amounts of data, a study showed on Tuesday.

      The method has already helped to discover 10 carbon-bearing minerals and could be widely applied to exploration, they wrote in the journal American Mineralogist. "Big data points to new minerals, new deposits," they wrote of the findings.

      The technique goes beyond traditional geology by amassing data about how and where minerals have formed, for instance by the cooling of lava after volcanic eruptions. The data can then be used to help find other deposits.

      "Minerals occur on Earth in clusters," said Robert Hazen, executive director of the Deep Carbon Observatory at the Carnegie Institution for Science in Washington and an author of the study.

      "When you see minerals together it's very like the way that humans interact in social networks such as Facebook," he said.

      Hazen said the technique was also like Amazon, which recommends books based on a client's previous orders, or by media streaming company Netflix, which proposes movies based on a customer's past viewing habits.

      "They are using vast amounts of data and make correlations that you could never make," he told Reuters.

      Lead author Shaunna Morrison, also at the Deep Carbon Observatory and the Carnegie Institution, said luck often played a big role for geologists searching for new deposits.

      "We are looking at it in a much more systematic way," she said of the project.

      Among the 10 rare carbon-bearing minerals discovered by the project were abellaite and parisite-(La). The minerals, whose existence was predicted before they were found, have no known economic applications.
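
      To give a flavour of the co-occurrence idea Hazen describes, here is a toy sketch that counts how often minerals appear together at the same localities and then "recommends" unseen minerals for a site from those counts; the localities and records below are invented, and real analyses draw on far larger mineral-locality databases.

      from collections import Counter
      from itertools import combinations

      localities = {
          "site_A": {"quartz", "calcite", "pyrite"},
          "site_B": {"quartz", "pyrite", "chalcopyrite"},
          "site_C": {"calcite", "dolomite"},
      }

      # count co-occurring mineral pairs across all localities
      pair_counts = Counter()
      for minerals in localities.values():
          for a, b in combinations(sorted(minerals), 2):
              pair_counts[(a, b)] += 1

      def recommend(observed, k=3):
          """Score unseen minerals by co-occurrence with those already observed."""
          scores = Counter()
          for (a, b), c in pair_counts.items():
              if a in observed and b not in observed:
                  scores[b] += c
              elif b in observed and a not in observed:
                  scores[a] += c
          return scores.most_common(k)

      print(recommend({"quartz"}))   # pyrite scores highest in this toy data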

      Gilpin Robinson, of the U.S. Geological Survey (USGS) who was not involved in the study, said the USGS had started to collaborate with the big data project.

      "The use of large data sets and analytical tools is very important in our studies of mineral and energy resources," he wrote in an email.

      The DCO project will also try to collect data to examine the geological history of the Moon and Mars.


                Author Todd Lyle talks Big Data and security        
                EPISODE67 - Big Data Servers        
      This is a podcast with Joseph George, Director of Big Data Servers at HP, talking about the SL4500 servers from HP and other associated solutions for customers looking to store and process massive amounts of data.
                EPISODE9 - Dave Fellows, CTO of GreenButton        
      Dave Fellows, CTO of GreenButton, talks about high performance computing and big data in the cloud with GreenButton Cloud Fabric.
                With Leonardo, SAP Takes Aim at the Internet of Things        
      SAP has presented new tools and services from the SAP Leonardo product line. The solutions are intended to enable companies to adopt cloud, big data, and Internet of Things technologies. At the Leonardo Live conference, SAP presented a range of services and products around SAP Leonardo. This solution is based on SAP's own Cloud Platform and offers a standardized… Read more
                Big Data, and Volume, Velocity, and Variety        
      The terms “volume”, “velocity”, and “variety” turn up often – very often – in discussions about “Big Data”. Gartner’s Doug Laney has laid claim to the original collective use of these terms, in a then-META Group article in February 2001 (“3D Data Management: Controlling Data Volume, Velocity, and Variety”). If Mr. Laney is indeed correct […]
                MarketFacts Quarterly - Conhecimento Iterativo: Criando uma Estratégia Útil de Big Data        
                MarketFacts Quarterly - Conocimiento Iteractivo: Para Crear una Estratégia Útil de Big Data        
                MarketFacts Quarterly - Iterative Insight: Creating a Big Data Strategy That Works for Your Organization        
                Big Data Analytics nos Serviços Financeiros (Research Briefing)        
                Big Data Analytics en los Servicios Financieros (Research Briefing)        
                Senior Global Relationship Manager-GES        
      NM-Albuquerque, CenturyLink (NYSE: CTL) is a global communications and IT services company focused on connecting its customers to the power of the digital world. CenturyLink offers network and data systems management, big data analytics, managed security services, hosting, cloud, and IT consulting services. The company provides broadband, voice, video, advanced data and managed network services over a robust 265,
                Eclipse Newsletter - BIRT and Big Data        
      Find out how to use BIRT to visualize data from Hadoop, Cassandra and MongoDB in this month's issue of the Eclipse Newsletter.
                Eclipse Survey: Big data, reporting and visualization        
      This survey is now closed. Thank you to everyone who contributed!
                International Conference on Cloud, Big Data and Trust (ICCBDT - 2013)        
      Before starting my postgraduate studies I was deeply fascinated by semantic web technology; these days cloud computing fascinates me. I am not sure when I developed this urge to perceive, to learn, and to innovate, but I think I have gained a lot from these academic years. Now I want to take my knowledge to the next level: I want to meet, listen to, and work with experienced people from the IT industry. That is why ICCBDT looks like an opportunity to me. Any event in our field, especially one on a subject you are fascinated by, is exciting to attend.

      Cloud computing is not exactly a new technology in the industry, but newcomers still don't know what it actually is. A little knowledge can also cause technology adoption to fail; nobody adopts a technology until it is beneficial to them. A big problem with the cloud is that everyone thinks cloud computing is insecure.
      Yes, security is a big issue, but if we know about a threat, then handling that threat can easily be provided for. I also have some questions about cloud computing in my mind; I will write more about these issues after attending this event. If you have queries too, you should attend this international conference, which is organised by the same university where I am studying, RGPV, Bhopal. The conference will be held from 13th to 15th November 2013 in Bhopal, in collaboration with EMC2.
      For more details about this event you can visit: http://www.rgpv.ac.in/ICCBDT/index.html

                The Evolution of IoT Analytics and Big Data        
      In this e-guide, Sisense co-founder Adi Azaria and Machina Research analyst Emil Berthelsen examine the current state of IoT analytics as well as where it's heading next. Learn about the impact IoT is having on everything from Hadoop big data platforms to real-time BI capabilities. Published by: Vitria Technology, Inc.
                Building an Enterprise Data Hub, Evaluating Infrastructure        
        Want to get the most out of your big data? Build an enterprise data hub (EDH). Big data is rapidly getting bigger. That in itself isn’t a problem. The issue is what Gartner analyst Doug Laney describes as the three Vs of Big Data: volume, velocity, and variety.   Volume refers to the ever-growing […]
                Elasticsearch ELK + Cisco UCS turn massive data into massive insights        
      The Internet of Everything continues to gain momentum and every new connection is creating new data. Cisco UCS Integrated Infrastructure for Big Data is helping customers convert that data into powerful intelligence, and we’re working with a number of new partners to bring exciting new solutions to our customers. Today, I want to spotlight Elasticsearch, […]
                Unlock The Value of Big Data with Cisco Unified Computing System        
      Big Data is not just about gathering tons of data, the digital exhaust from the internet, social media, and customer records.  The real value is in being able to analyze the data to gain a desired business outcome.   Those of us who follow the Big Data market closely never lack for something new to […]
                Cisco UCS and ACI Infrastructure Innovations for Big Data at Hadoop Summit        
      Big Data remains one of the hottest topics in the industry due to the actual dollar value that businesses are deriving from making sense from tons of structured and unstructured data.  Virtually every field is leveraging a data-driven strategy as people, process, data and things are increasing being connected (Internet of Everything). New tools and […]
                Cisco, Cloudera and The Future of Data Management – on Display at Cloudera Sessions        
        The Cloudera Sessions Roadshow helps companies to navigate the Big Data journey.  As Hadoop takes the data management market by storm, organizations are evolving the role it plays in the modern data center. This disruptive technology is quickly transforming an industry, the value it adds to the modern data center, and how you can […]
                Maximizing Big Data Performance and Scalability with MapR and Cisco UCS        
      Huge amounts of information are flooding companies every second, which has led to an increased focus on big data and the ability to capture and analyze this sea of information. Enterprises are turning to big data and Apache Hadoop in order to improve business performance and provide a competitive advantage. But to unlock business value […]
                Get More out of your Data with Cisco at Strata Hadoop World October 28 – 30, 2013        
        With enough hype to rival even the most popular of Super Bowls, Big Data experts will converge on New York City in just a couple of weeks! But big data has good reason for all the hype as businesses continue to find new ways to leverage the insights derived from vast data pools that are continuing […]
                Design Zone Webinar: Unlock More Value from Big Data with Cisco UCS        
          Big Data has become mainstream as businesses realize its benefits, including improved operation efficiency, better customer experience, and more accurate predictions. However, companies are often challenged by the complexities of traditional server solutions. In this webinar, learn how to unlock the value of Big Data with the Cisco Unified Computing System (Cisco UCS). […]
                Join Cisco on the Road to Big Data        
            Cloudera Sessions is coming to a City Near You! Have you registered for the upcoming Cloudera Sessions roadshow yet?  According to IDC Analysts, the market for Big Data will reach $16.9 billion by 2015, with an outrageous 40% CAGR. As the sheer volume of data continues to climb, enterprise customers will need […]
                Will big data provide a cure-all for ailing health budgets?        

      As soaring health budgets continue to cause pain for governments and medical care providers, relief may be in sight thanks to big data. A new study by Lux Research found that advanced big data and analytics technologies are poised to help rein in runaway healthcare costs. The report “Industrial Big Data and Analytics in Digital... Read more »



                Practical Data Science & Big Data/Data Analytics Online Training @ SequelGate (Hyderabad)        
      SequelGate is one of the best training institutes for Data Science & Big Data/Data Analytics training. We have been providing classroom trainings and corporate training. All our training sessions are COMPLETELY PRACTICAL. DATA S...
                The Failure of Cyber Security and the Startups Who Will Save Us        
      2014 will be remembered as the year the cyber dam broke, breached by sophisticated hackers who submerged international corporations and government agencies in a flood of hurt. Apple, Yahoo, P.F. Chang’s, AT&T, Google, Walmart, Dairy Queen, UPS, eBay, Neiman Marcus, the US Department of Energy and the IRS all reported major losses of private data relating to customers, patients, taxpayers and employees. Boeing, US Transportation Command, the US Army Corps of Engineers, and US Investigations Services (which ran the FBI’s security clearance checks) reported serious breaches of national security. Prior to last year, devastating economic losses had accrued only to direct targets of cyberwarfare, such as RSA and Saudi Aramco, but in 2014, at least five companies with no military ties -- JP Morgan, Target, Sony, Kmart, and Home Depot -- incurred losses exceeding $100M from forensic expenses, investments in remediation, fines, legal fees, re-organizations, and class-action lawsuits, not to mention damaged brands.

      The press has already reported on where things went wrong at each company, promoting a false sense of security based on the delusion that remediating this vulnerability or that one would have prevented the damage. This kind of forensic review works for aviation disasters, where we have mature, well understood systems and we can fix the problems we find in an airplane. But information networks are constantly changing, and adversaries constantly invent new exploits. If one doesn’t work, they simply use another, and therein lies the folly of forensics.

      Only when you step back and look at 2014 more broadly can you see a pattern that points toward a systemic failure of the security infrastructure underlying corporate networks, described below. So until we see a seismic shift in how vendors and enterprises think about security, hackers will only accelerate their pace of “ownership” of corporate and government data assets.

      The Sprawl of Cyberwarfare

      The breaches of 2014 demonstrate how cyberwarfare has fueled the rampant spread of cyber crime.

      For the past decade, the world’s three superpowers, as well as UK, North Korea and Israel, quietly developed offensive capabilities for the purposes of espionage and military action. Destructive attacks by geopolitical adversaries have clearly been reported on private and public sector targets in the US, Iran, South Korea, North Korea, Israel, Saudi Arabia and elsewhere. While Snowden exposed the extent of cyber espionage by the US, no one doubts that other nations prowl cyberspace to a similar or greater extent.

      The technical distinction of these national cyber agencies is that they developed the means to target specific data assets or systems around the world, and to work their way through complex networks, over months or years, to achieve their missions. Only a state could commit the necessary combination of resources for such a targeted attack: the technical talent to create zero-day exploits and stealthy implants; labs that duplicate the target environment (e.g. the Siemens centrifuges of a nuclear enrichment facility); the field agents to conduct on-site ops (e.g. monitoring wireless communications, finding USB ports, or gaining employment); and years of patience. As a result of these investments in “military grade” cyber attacks, the best of these teams can boast a mission success rate close to 100%.

      But cyber weapons are even harder to contain than conventional ones. Cyberwar victories have inspired terrorists, hacktivists and criminals to follow suit, recruiting cyber veterans and investing in the military grade approach. (Plus, some nations have started targeting companies directly.) No longer content to publish malware and wait for whatever data pop up, criminals now identify the crown jewels of businesses and target them with what we call Advanced Persistent Threats (APTs). You want credit cards? Get 56 million of them from Home Depot. You want to compromise people with the most sensitive secrets? Go to straight to the FBI’s archive of security clearances. You want the design of a new aircraft? Get it from Boeing. You need data for committing online bank theft? Get it for 76 million households at JP Morgan Chase.

      That’s why cyberspace exploded in 2014.

      This is Not the Common Cold

      But why are the crown jewels so exposed? Haven’t these companies all spent millions of dollars every year on firewalls, anti-virus software, and other security products? Don’t their IT departments have security engineers and analysts to detect and deflect these attacks?

      The problem is that up until this year, corporate networks were instrumented to defend against generic malware attacks that cause minimal damage to each victim. Generic malware might redirect your search page, crash your hard drive, or install a bot to send spam or mine bitcoin. It’s not looking for your crown jewels because it doesn’t know who you are. It may worm its way to neighboring machines, but only in a singular, rudimentary way that jumps at most one or two hops. It’s automated and scalable – stealing pennies from all instead of fortunes from a few. If it compromises a few machines here and there, no big deal.

      But with Advanced Persistent Threats, a human hacker directs the activity, carefully spreading the implant, so even the first point of infection can lead to devastation. These attacks are more like Ebola than the common cold, so what we today call state-of-the-art security is only slightly more effective than taking Airborne (and that’s a low bar). As long as corporate networks are porous to any infection at all, hackers can launch stealth campaigns jumping from host to host as they map the network, steal passwords, spread their agents, and exfiltrate data. Doubling down on malware filters will help, but it can never be 100% effective. All it takes is one zero-day exploit, or a single imprudent click on a malicious email, tweet or search result, for the campaign to begin. Or the attacker can simply buy a point of entry from the multitudes of hackers who already have bots running on the Internet.

      Too Big Data

      The dependence on malware filters is only half the problem. Ask any Chief Information Officer about his or her security infrastructure and you will hear all about the Secure Operations Center in which analysts pore over alerts and log files (maybe even 24/7), identifying anomalies that may indicate security incidents. These analysts are tasked with investigating the incidents and rooting out any unauthorized activity inside the network. So even if someone breaks into the network, analysts will stop them. And indeed, thousands of security products today participate in the ecosystem by finding anomalies and generating alerts for the Security Information and Event Management (SIEM) system. Every week a new startup pops up, touting an innovative way to plow through log files, network stats, and other Big Data to identify anomalies.

      But sometimes anomalies are just anomalies, and that’s why a human analyst has to investigate each alert before taking any pre-emptive action, such as locking a user out of the network or re-imaging a host. And with so many products producing so many anomalies, they are overwhelmed with too much data. They typically see a thousand incidents every day, with enough time to investigate twenty. (You can try to find more qualified analysts but only with diminishing returns, as each one sees less of the overall picture.)

      That's why, for example, when a FireEye system at Target spotted the malware used to exfiltrate 40 million credit cards, it generated an alert for the Security Operations Center in Minneapolis, and nothing happened. Similarly, a forensic review at Neiman Marcus revealed more than 60 days of uninvestigated alerts that pointed to exfiltrating malware. Sony knew they were under attack for two years leading up to their catastrophic breach, and still they couldn't find the needles in the haystack.

      And yet the drumbeat goes on, as security vendors old and new continue to tout their abilities to find anomalies. They pile more and more alerts into the SIEM, guaranteeing that most will drop on the floor. No wonder APTs are so successful.

      A Three Step Program

      "Know Thy Self, Know Thy Enemy" - Sun Tzu, The Art of War

      We need to adapt to this new reality, and the cyber security industry needs to enable it. Simply put, businesses need to focus their time and capital on stopping the most devastating attacks.

      The first step here is to figure out what those attacks look like. What are your crown jewels? What are the worst-case scenarios? Do you have patient data, credit cards, stealth fighter designs, a billion dollars in the bank, damning emails, or a critical server that, if crippled by a Distributed Denial of Service attack, would cause your customers to instantly drop you? As you prioritize the threats, identify your adversaries. Is it a foreign competitor, Anonymous, disgruntled employees, or North Korea? Every business is different, and each has a different boogeyman. The good news is that even though most CEOs have never thought about it, this first step is easy and nearly free. (Cyber experts like Good Harbor or the BVP-funded K2 Intelligence can facilitate the process.)

      Second, businesses need real-time threat intelligence that relates to their unique threatscapes. Almost every security technology depends upon a Black List that identifies malicious IP addresses, device fingerprints, host names, domains, executables or email addresses, but naturally they come with generic, one-size-fits-all data. Dozens of startups now sell specialized threat intel, such as BVP-funded Internet Identity, which allows clusters of similar companies to pool their cyber intelligence, or BVP-funded iSight Partners, whose global field force of over 100 analysts tracks and profiles cyber adversaries and how to spot them in your network. What better way for your analysts to investigate the most important incidents than to prioritize the ones associated with your most formidable adversaries?
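
      As a concrete (and purely illustrative) sketch of how a Black List gets used, here are a few lines of Python that cross-reference a connection log against an indicator feed; the file names and CSV columns are hypothetical, and a commercial feed from the vendors above would be far richer, but the mechanic is the same:

      import csv

      # Load a Black List of known-bad indicators (IPs, domains, hashes)
      # from a hypothetical threat-intel feed, one indicator per line.
      def load_blacklist(path):
          with open(path) as f:
              return {line.strip() for line in f if line.strip()}

      # Scan a connection log (assumed columns: timestamp, source_ip,
      # dest_host) and flag any event touching a known-bad indicator.
      def flag_matches(log_path, blacklist):
          with open(log_path) as f:
              return [row for row in csv.DictReader(f)
                      if row["dest_host"] in blacklist
                      or row["source_ip"] in blacklist]

      blacklist = load_blacklist("intel_feed.txt")   # hypothetical feed file
      for event in flag_matches("connections.csv", blacklist):
          print("Investigate:", event["timestamp"], event["dest_host"])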

      "This is a global problem. We don't have a malware problem. We have an adversary problem. There are people being paid to try to get inside our systems 24/7"         
      - Tony Cole, FireEye VP on CNN

      And finally, security analysts need fewer alerts, not more. Instead of finding more anomalies, startups would better spend their time finding ways to eliminate alerts that don’t matter, and highlighting the ones that do. They would provide the analysts with better tools for connecting the alerts into incidents and campaigns, tapping into the skills of experienced “military grade” hackers to profile the attack patterns.
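
      To illustrate the direction (this is a toy sketch, not any vendor's product), a first step toward "fewer alerts" is to collapse raw alerts into per-host incidents and escalate only those corroborated by more than one detection source, ranked so the likeliest campaigns surface first:

      from collections import defaultdict

      # alerts: list of dicts, e.g. {"host": "db-01", "source": "IDS", "msg": "..."}
      def correlate(alerts, min_sources=2):
          by_host = defaultdict(list)
          for alert in alerts:
              by_host[alert["host"]].append(alert)
          incidents = []
          for host, items in by_host.items():
              sources = {a["source"] for a in items}
              # Escalate only when independent products agree -- a crude
              # proxy for a campaign rather than a one-off anomaly.
              if len(sources) >= min_sources:
                  incidents.append({"host": host, "alerts": items})
          # Busiest hosts first, so twenty investigations a day go to the
          # incidents that look most like coordinated activity.
          return sorted(incidents, key=lambda i: len(i["alerts"]), reverse=True)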

      Outlook

      The challenge of securing data today is obviously complex, with many other pressing opportunities for improvement such as cloud security, mobile security, application security and encryption. But as cyberwar spreads to the commercial Internet, re-orienting enterprise security to focus on Advanced Persistent Threats should be the single most important initiative for businesses and vendors alike. Of course, inertia is powerful, and it may take boards of directors, CISOs, product managers, entrepreneurs, and venture capitalists another tumultuous year in cyberspace to get the message.

                Dinosaurs in Space!        
      PCs and smartphones have pushed mainframes to the brink of extinction on Earth, and yet mainframes still thrive in space.

      Most every satellite in orbit is a floating dinosaur - a bloated, one-off, expensive, often militarized, monolithic relic of the mainframe era. The opportunity for entrepreneurs today is to launch modern computer networks into space, disrupting our aging infrastructure with an Internet of microsats. 

      So why has it taken so long for modern computing to reach space? Gravity. It’s hard to launch things. Governments have the money and patience to do it, as do large cable and telecom corporations. These players are slow to innovate, and large satellites have met their basic needs around science, defense, and communications, albeit at very high costs.

      That's changing: several IT trends have come together to herald the extinction of these orbiting pterodactyls:
      • Moore's law has reached the point where a single rocket launch can be amortized across dozens of tiny satellites, and the replacement cost is so low that we needn't burden our missions with triple redundancies and a decade of testing;
      • Global computing clouds make it easy to deploy ground stations; and
      • Advances in Big Data enable us to process the torrential flows of information we get from distributed networks.

      These trends have reduced the cost of a single aerospace mission from a billion dollars down to a hundred million just as the early-stage VC community amassed enough capital to undertake projects of this scope. And now that a handful of venture-backed startups like SpaceX and Skybox are demonstrating success, the number of aerospace business plans circulating through Sand Hill Road has climbed faster than a Falcon 9.

      With each successful startup, progress accelerates and synergies emerge. As SpaceX makes launches cheaper, it opens the frontier to more entrepreneurs. Pioneers like Skybox and Planet Labs have to build end-to-end solutions for their markets, including everything from satellite buses to big data search algorithms; but there will soon evolve an ecosystem of vendors who specialize in launch mechanisms, cubesats, sensors, inter-sat communications, analytics, and software applications.

      So who are the customers for a space-based Internet? At first, aerospace startups will disrupt two large markets:

      ·       Scientific exploration of space.  In the past, costly scientific missions such as Apollo ($355 million in 1966), ISS ($3 billion/year), Hubble ($10 billion), and Cassini ($3.3 billion) were designed and built by government agencies. Expect startups to disrupt this market with innovations in rocketry, robotics, optics, cloud computing, space suits, renewable energy, and more.

      ·       Communications. Government defense agencies spend considerable sums on communications to serve their space-based weapon systems and intelligence bureaus. Media and cable companies also commission satellites to serve their consumers. Microsat networks of radios will supply these customers more cheaply and reliably.

      While satellite avionics improve with Moore's Law, certainly some payloads, like telescopes and robots, cannot be miniaturized beyond the constraints of physics. But even these missions will benefit from the cheap, rapid testing available on a nanosatellite. Just as programmers today can build entire software companies using a free AWS account and the open source LAMP stack, space-faring entrepreneurs can now explore myriad new business models by launching $1,000 cubesats out of the ISS.


      In addition to disrupting existing markets, microsat networks in space will enable a new and important capability:  Planetary Awareness. When we surround our planet with sensors across the frequency spectrum, we will have access to data that opens up new markets. Today, we have sensors across our landmasses, but adding sensors in space, the ocean, and the atmosphere will illuminate both natural phenomena and human logistics. 

      Planetary Awareness will enable many capabilities of high social value:

      o   Aviation and maritime safety: The need for tracking and communicating with aircraft and ships is in the public eye today following the loss of flight MH370.

      o   Nature surveillance: Predict and monitor weather, global warming, natural disasters, and the risk of meteor damage (as pioneered by the B612 Foundation).

      o   Global journalism: Expose protests, genocides, and other state-censored events.

      Planetary Awareness will also open new markets of high economic value, which are much more likely to drive the success of aerospace startups:

      o   Finding natural resources: Minerals and fuel sources abound upon the ocean floor (as discovered by Liquid Robotics’ fleet of WaveGliders) and near-Earth asteroids (as Planetary Resources promises to find using cheap microsats).

      o   Financial services: Tracking human activity and commerce (e.g. the proverbial counting of cars in parking lots) yields valuable data to merchants, logistics providers and investors.

      o   Military and geopolitical intelligence: Governments already purchase imagery for this use, but visibility will greatly expand from more frequent flyovers, video, radio surveillance, and automated analytics.

      Geospatial imaging attracts many startups because it is already a robust and underserved market, but the opportunity to enable planetary awareness is much broader. Dan Berkenstock didn't start Skybox Imaging just to sell images and video: he had a more profound vision for the impact that startups can have on the aerospace industry. His mission attracted co-founders from Stanford and NASA, his CEO Tom Ingersoll from Universal Space, and aerospace legends like Joe Rothenberg, who led the Hubble repair, as well as other star engineers and investors. And now Skybox is proving that they, along with SpaceX and other nimble startups, will displace dinosaurs in space with data services driven by constellations of smart microsats.


                The Coming Wave of Cloud Security Startups        
      This is a reprint of an article I wrote this week for MIT Technology Review.

      Our growing computer security problems will create many new companies.

      The threat from cyber-intrusions seems to have exploded in just the last 18 months. Mainstream media now report regularly on massive, targeted data breaches and on the digital skirmishes waged among nation states and cybermilitants.

      Unlike other looming technical problems that require innovation to address, cybersecurity never gets solved. The challenges of circuit miniaturization, graphical computing, database management, network routing, server virtualization, and similarly mammoth technical problems eventually wane as we tame their complexity. Cybersecurity is a never-ending Tom and Jerry cartoon. Like antibiotic-resistant bacteria, attackers adapt to our defenses and render them obsolete.

      As in most areas of IT and computing, innovation in security springs mostly from startup companies. Larger systems companies like Symantec, Microsoft, and Cisco contribute to the corpus of cybersecurity, but mostly acquire their new technologies from startups. Government agencies with sophisticated cyberskills tend to innovate more on the offensive side. I think that in the coming years we will see many small, creative teams of security engineers successfully discovering, testing, and building out clever new ways to secure cyberspace.

      Anyone looking to found or invest in one of those small security companies destined for success should focus on the tsunami of change rocking the IT world known as cloud computing. In a transformation that eclipses even the advent of client–server computing in the 1980s, businesses are choosing to subscribe to services in the cloud over running software on their own physical servers. Incumbents in every category of software are being disrupted by cloud-based upstarts. According to Forrester, the global market for cloud computing will grow more than sixfold this decade, to over a quarter trillion dollars.


      Cloud security, as it is known, is today one of the less mature areas of cloud computing, but it is already clear that it will become a significant chunk of that vast new market. A Gartner report earlier this year predicted that the growth of cloud-based security services would overtake traditional security services in the next three years.

      Just like other software products, conventional security appliances are being replaced by cloud-based alternatives that are easier to deploy, cheaper to manage, and always up-to-date. Cloud-based security protections can also be more secure, since the vendor can correlate events and profile attacks across all of its customers’ networks. This collaborative capability will be critical in the coming years as the private sector looks to government agencies like the National Security Agency for protection from cyberattacks.

      The cloud also enables new security services based on so-called big data, which simply could not exist as standalone products. Companies like SumoLogic can harvest signals from around the Web for analysis, identifying attacks and attackers that couldn't be detected using data from a single incident or source.

      These new data-centric, cloud-based security products are crucial to solving the challenges of keeping mobile devices secure. Most computers shipped today are mobile devices, and they make juicier targets than PCs because they have location and payment data, microphones, and cameras. But mobile carriers and employers cannot lock down phones and tablets completely because they are personal devices customized with personal apps. Worse, phones and tablets lack the processing power and battery life to run security processes as PCs do.

      Cloud approaches to security offer a solution. Software-as-a-service security companies like Zscaler can scan our mobile data traffic using proxies and VPNs, scrubbing it for malware, phishing, data leaks, and bots. In addition, we see startups like Blue Cava, Iovation, and mSignia using Big Data to prevent fraud by fingerprinting mobile devices.

      Cloud security also involves protecting cloud infrastructure itself. New technologies are needed to secure client data inside cloud-based services against theft or manipulation during transit or storage. Some security auditors and security companies already sell into this market, but most cloud developers, focused on strong customer growth, have been slow to deploy strong security. Eventually it should become possible for cloud computing customers to encrypt and destroy data using their own encryption keys. Until then, there is an opportunity for startups such as CipherCloud and Vaultive to sell encryption technology that companies use over the top of their cloud services to encrypt the data inside.

      Lastly, cloud security also includes protecting against the cloud, which enables creative new classes of attack. For example, Amazon Web Services can be used for brute-force attacks on cryptographic protocols, as one German hacker did in 2010 to break the NSA-designed Secure Hashing Algorithm. Attackers can use botnets and virtual servers to wage distributed denial of service attacks, and bots can bypass CAPTCHA defenses by crowdsourcing the answers. Cloud-based attacks demand innovative defenses that will likely come from startups. For example, Prolexic and Defense.net (a company Bessemer has invested in) operate networks of filters that buffer their clients from cloud-based DDoS attacks.

      Cloud computing may open up enormous vulnerabilities on the Internet, but it also presents great opportunity for innovative cybersecurity. In the coming decade, few areas of computing will be as attractive to entrepreneurs, technologists, and investors.
                Powerful Learning Experiences at the Intersection of Big Data and Big Empathy        


                Prism in a Big Data World        
      Debate around the Prism revelations is far-reaching, but how shocking are these developments?
                First victory with 'artificial intelligence' in the America's Cup        
      Ben Ainslie's AC45 Land Rover BAR repeats its triumph in the Louis Vuitton Series regatta in Portsmouth (United Kingdom). The boat uses a Formula 1 'Big Data' system to compete. Read more
                Windows Azure – Write, Run or Use Software        

      Windows Azure is a platform that has you covered, whether you need to write software, run software that is already written, or install and use "canned" software, whether you or someone else wrote it. Like any platform, it's a set of tools you can use where it makes sense to solve a problem.

      You can click on the graphic below for a larger picture of these components, or download a poster with more details here.

      The primary location for Windows Azure information is http://windowsazure.com. You can find everything there, from the development kits for writing software to pricing, licensing and tutorials on all of that.

      I have a few links here for learning to use Windows Azure – although it's best if you focus not on the tools, but on what you want to solve. I've got it broken down here into various sections, so you can quickly locate things you want to know. I'll include resources here from Microsoft and elsewhere – I use these same resources in the Architectural Design Sessions (ADS) I do with my clients worldwide.

      There is also a great video series on Cloud Fundamentals here, if you have some time to watch it; the series covers a lot of ground.

      Write Software

      Also called "Platform as a Service" (PaaS), Windows Azure has lots of components you can use together or separately that allow you to write software in .NET or various Open Source languages to work completely online, in partnership with code you have on-premises, or both – even if you're using other cloud providers. Keep in mind that all of the features you see here can be used together or independently; for instance, you might use only a Web Site or only Storage, or both together. You can access all of these components through standard REST API calls, or using our Software Development Kits' APIs, which are a lot easier. In any case, you simply use Visual Studio, Eclipse, Cloud9 IDE, or even a text editor to write your code from a Mac, PC or Linux.

      Components you can use:

      Azure Web Sites: Windows Azure Web Sites allow you to quickly write and deploy websites, without setting up a Virtual Machine, installing a web server or configuring complex settings. They work alone, with other Windows Azure Web Sites, or with other parts of Windows Azure. Read more about deciding to use Web Sites or Roles.

      Web and Worker Roles: Windows Azure Web Roles give you a full stateless computing instance with Internet Information Services (IIS) installed and configured. Windows Azure Worker Roles give you a full stateless computing instance without Internet Information Services (IIS) installed, often used in a "Services" mode. Scale-out is achieved either manually or programmatically under your control.

      Storage: Windows Azure Storage types include Blobs to store raw binary data, Tables to use key/value pair data (like NoSQL data structures), Queues that allow interaction between stateless roles, and a relational SQL Server database.
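
      To give a feel for the developer experience, here is a minimal sketch using the Python SDK of this era (the legacy azure package's BlobService interface; the account name and key are placeholders you would copy from your own storage account):

      from azure.storage import BlobService   # legacy Windows Azure Python SDK

      # Placeholder credentials -- copy these from the management portal.
      blob_service = BlobService(account_name='myaccount', account_key='mykey')

      # Create a container, then store and read back a raw binary blob.
      blob_service.create_container('mycontainer')
      blob_service.put_blob('mycontainer', 'hello.txt', 'Hello, Windows Azure!',
                            x_ms_blob_type='BlockBlob')
      print(blob_service.get_blob('mycontainer', 'hello.txt'))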

      Other Services: Windows Azure has many other services, such as a security mechanism, a Cache (memcacheD compliant), a Service Bus, a Traffic Manager and more. Once again, these features can be used with a Windows Azure project, or alone based on your needs.

      Windows Azure Mobile Services: A simple framework service which enables you to quickly develop the back-end for mobile services. For the front-end, check out the iOS SDK, news about the Android SDK, and the Windows Phone SDK.

       

      Various Languages: Windows Azure supports the .NET stack of languages, as well as many Open-Source languages like Java, Python, PHP, Ruby, NodeJS, C++ and more.

       

      Use Software

      Also called "Software as a Service" (SaaS), this often means consumer or business-level software like Hotmail or Office 365. In other words, you simply log on, use the software, and log off – there's nothing to install, and little to even configure. For the Information Technology professional, however, it's not quite the same. We want software that provides services, but within a platform – things like Hadoop or other software we don't want to have to install and configure ourselves.

      Components you can use:

      Kits: Various software "kits" or packages are supported with just a few clicks, such as Umbraco, WordPress, and others.

      Windows Azure Media Services: Windows Azure Media Services is a suite of services that allows you to upload media for encoding, processing and streaming – or any one of those functions. We can add DRM and even commercials to your media if you like. Windows Azure Media Services is used to stream everything from large events down to small training videos.

      High Performance Computing and "Big Data": Windows Azure allows you to scale to huge workloads, using a few clicks to deploy Hadoop Clusters or High Performance Computing (HPC) nodes, accepting HPC Jobs, Pig and Hive Jobs, and even interfacing with Microsoft Excel.

      Windows Azure Marketplace: Windows Azure Marketplace offers data and programs you can quickly implement and use – some free, some for-fee.

       

      Run Software

      Also known as “Infrastructure as a Service” (IaaS), this offering allows you to build or simply choose a Virtual Machine to run server-based software.

      Components you can use:

      Persistent Virtual Machines: You can choose to install Windows Server, Windows Server with Active Directory, with SQL Server, or even SharePoint from a pre-configured gallery. You can configure your own server images with standard Hyper-V technology and load them yourselves – and even bring them back when you're done. As a new offering, we even allow you to select various distributions of Linux – a first for Microsoft.

      Windows Azure Connect: You can connect your on-premises networks to Windows Azure Instances.

      Storage: Windows Azure Storage can be used as a remote backup, a hybrid storage location and more, using software or even hardware appliances.

       

      Decision Matrix

      With all of these options, you can use Windows Azure to solve just about any computing problem. It's often hard to know whether to run something on-premises or in the cloud, and what kind of service to use.

      I've used a decision matrix in the last couple of years to take a particular problem and choose the proper technology to solve it. It's all about options – there is no "silver bullet", whether that's Windows Azure or any other set of functions. I take the problem, decide which particular component I want to own and control – and choose the column that has that box darkened. For instance, if I have to control the wiring for a solution (a requirement in some military and government installations), that means the "Networking" component needs to be dark, and so I select the "On Premises" column for that particular solution. If I just need the solution provided and I want no control at all, I can look at "Software as a Service" solutions.

      [Decision matrix diagram]

      Training Resources

      Microsoft Virtual Academy: Free video-based training.

      Windows Azure Documentation: Official documentation for the product.

       

      Security, Pricing, and Other Info

      Security: Security is one of the first questions you should ask in any distributed computing environment. We have certification info, coding guidelines and more – even a general Request for Information (RFI) response already created for you.

      Pricing: Are there licenses? How much does this cost? Is there a way to estimate the costs in this new environment?

      New Features: Many new features were added to Windows Azure - and you can keep up to date with community information released monthly here: http://blogs.msdn.com/b/davidmcg/

      Windows Azure Cookbooks: Great resource for architecture solutions - http://www.notsotrivial.net/blog/category/Architecture.aspx

      Support: Software Support on Virtual Machines, general support, and support plans

      Hands-On Labs: http://msdn.microsoft.com/en-us/jj618399

      Windows Azure Capability Discussion Presentation, Windows Azure Solution Implementer Guide, and Windows Azure Business Priorities Guide

       

       


                Big Data - A Microsoft Tools Approach        

      (As with all of these types of posts, check the date of the latest update I’ve made here. Anything older than 6 months is probably out of date, given the speed with which we release new features into Windows and SQL Azure)

      I don’t normally like to discuss things in terms of tools. I find that whenever you start with a given tool (or even a tool stack) it’s too easy to fit the problem to the tool(s), rather than the other way around as it should be.

      That being said, it’s often useful to have an example to work through to better understand a concept. But like many ideas in Computer Science, “Big Data” is too broad a term in use to show a single example that brings out the multiple processes, use-cases and patterns you can use it for.

      So we turn to a description of the tools you can use to analyze large data sets. "Big Data" is a term used lately to describe data sets that have the "four V's" as a characteristic, but I have a simpler definition I like to use:

      Big Data involves a data set too large to process in a reasonable period of time

      I realize that's a bit broad, but in my mind it answers the question and is fairly future-proof. The general idea is that you want to analyze some data, but with whatever methods, storage, compute and so on you currently have at hand, you can't finish processing it in a time period you are comfortable with. I'll explain some new tools you can use for this processing.

      Yes, this post is Microsoft-centric. There are probably posts from other vendors and open-source that cover this process in the way they best see fit. And of course you can always “mix and match”, meaning using Microsoft for one or more parts of the process and other vendors or open-source for another. I never advise that you use any one vendor blindly - educate yourself, examine the facts, perform some tests and choose whatever mix of technologies best solves your problem.

      At the risk of being vendor-specific, and probably incomplete, I use the following short list of tools Microsoft has for working with "Big Data". There is no single package that performs all phases of analysis. These tools are what I use; they should not be taken as an authoritative Microsoft statement of the final toolset for a given problem space. In fact, that's the key: find the problem and then fit the tools to that.

      Process Types

      I break up the analysis of the data into two process types. The first is examining and processing the data in-line, meaning as the data passes through some process. The second is a store-analyze-present process.

      Processing Data In-Line

      Processing data in-line means that the data doesn’t have a destination - it remains in the source system. But as it moves from an input or is routed to storage within the source system, various methods are available to examine the data as it passes, and either trigger some action or create some analysis.

      You might not think of this as "Big Data", but in fact it can be. Organizations have huge amounts of data stored in multiple systems. Many times the data from these systems does not end up in a database for evaluation. There are options, however, to evaluate that data in real time and either act on it or perhaps copy or stream it to another process for evaluation.

      The advantage of an in-stream data analysis is that you don’t necessarily have to store the data again to work with it. That’s also a disadvantage - depending on how you architect the solution, you might not retain a historical record. One method of dealing with this requirement is to trigger a rollup collection or a more detailed collection based on the event.
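
      The pattern is easy to sketch (this toy Python loop is purely conceptual -- the real tools below are far more capable): examine each event as it streams past, keep only a small rolling window rather than storing everything, and fire a trigger that can kick off a rollup or detailed collection when something looks anomalous:

      from collections import deque

      # Keep a small rolling window of recent readings; nothing is kept
      # long-term unless a trigger fires.
      window = deque(maxlen=100)

      def on_event(value, threshold=3.0):
          window.append(value)
          mean = sum(window) / len(window)
          # Trigger when the newest reading deviates sharply from the
          # window average -- e.g. start a detailed historical collection.
          if len(window) > 10 and abs(value - mean) > threshold:
              print("Trigger: anomalous reading", value, "vs mean", round(mean, 2))

      for reading in [1.0, 1.2, 0.9, 1.1] * 5 + [9.5]:
          on_event(reading)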

      StreamInsight - StreamInsight is Microsoft’s “Complex Event Processing” or CEP engine. This product, hooked into SQL Server 2008R2, has multiple ways of interacting with a data flow. You can create adapters to talk with systems, and then examine the data mid-stream and create triggers to do something with it. You can read more about StreamInsight here: http://msdn.microsoft.com/en-us/library/ee391416(v=sql.110).aspx 

      BizTalk - When there is more latency available between the initiation of the data and its processing, you can use Microsoft BizTalk. This is a message-passing and Service Bus oriented tool, and it can also be used to join together data from systems that normally do not have a direct link, for instance a Mainframe system and SQL Server. You can learn more about BizTalk here: http://www.microsoft.com/biztalk/en/us/overview.aspx 

      .NET and the Windows Azure Service Bus - Along the same lines as BizTalk but with a more programming-oriented design are the Windows and Windows Azure Service Bus tools. The Service Bus allows you to pass messages as well, and opens up web interactions and even inter-company routing. BizTalk can do this as well, but the Service Bus tools use an API approach for designing the flow and interfaces you want. The Service Bus offerings are also intended as near real-time, not as a streaming interface. You can learn more about the Windows Azure Service Bus here: http://www.windowsazure.com/en-us/home/tour/service-bus/ and more about the Event Processing side here: http://msdn.microsoft.com/en-us/magazine/dd569756.aspx 

      Store-Analyze-Present

      A more traditional approach with an organization's data is to store the data and analyze it out-of-band. This began with simply running code over a data store, but as locking and blocking became an issue on a file system, Relational Database Management Systems (RDBMSs) were created. Over time a distinction was made between systems designed for online transaction processing, meant to be highly available for writing data (OLTP), and systems designed for analytical and reporting purposes (OLAP).

      Later the data grew larger than these systems were designed for, primarily due to consistency requirements. In analysis, however, consistency isn't always a requirement, and so file-based systems for that analysis were re-introduced from Mainframe-era concepts, with new technology layered in for speed and size.

      I normally break up the process of analyzing large data sets into four phases (sketched in code after the list):

      1. Source and Transfer - Obtaining the data at its source and transferring or loading it into storage, optionally transforming it along the way.
      2. Store and Process - Data is stored on some sort of persistence; in some cases an engine handles the acquisition and placement on persistent storage, as well as retrieval through an interface.
      3. Analysis - A new layer introduced with "Big Data" is a separate analysis step. This is dependent on the engine or storage methodology, is often programming-language or script based, and sometimes feeds its results back into the data. Some engines and processes combine this function with the previous phase.
      4. Presentation - In most cases, the data needs a graphical representation to be comprehensible, especially in a series or trend analysis. In other cases a simple symbolic representation suffices, similar to the "dashboard" elements in a Business Intelligence suite. Presentation tools may also have an analysis or refinement capability to allow end users to work with the data sets. As in the Analysis phase, some methodologies bundle the Analysis and Presentation phases into one toolset.
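
      As a shape-of-the-pipeline sketch only (Python stubs standing in for the real tools named in each phase below), the four phases chain together like this:

      # Skeleton of the four phases; each stub names the class of tool
      # that would really do the work.
      def source_and_transfer():
          # e.g. SSIS, BCP or Sqoop pulling rows out of a source system
          return [{"region": "west", "sales": 120}, {"region": "east", "sales": 80}]

      def store(records):
          # e.g. SQL Server, Hadoop/HDFS or Windows Azure storage
          return list(records)

      def analyze(records):
          # e.g. T-SQL, HiveQL or Pig jobs; here, a trivial aggregate
          return sum(r["sales"] for r in records)

      def present(total):
          # e.g. Excel, Reporting Services or Power View
          print("Total sales:", total)

      present(analyze(store(source_and_transfer())))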

      Source and Transfer

      You'll notice in this area, along with those that follow, that Microsoft is adopting not only its own technologies but also those from open source. This is a positive sign, and means that you will have a best-of-breed, supported set of tools to move the data from one location to another. Traditional file copy, File Transfer Protocol and the like are certainly options, but they do not normally deal well with moving whole datasets.

      I’ve already mentioned the ability of a streaming tool to push data into a store-analyze-present model, so I’ll follow up that discussion with the tools that can extract data from one source and place it in another.

      SQL Server Integration Services (SSIS)/SQL Server Bulk Copy Program (BCP) - SSIS is a SQL Server tool used to move data from one location to another, optionally performing transforms or other processes as it does so. You are not limited to working with SQL Server data - in fact, almost any modern source of data, from text to various database platforms, is available to move to various systems. It is also extremely fast and has a rich development environment. You can learn more about SSIS here: http://msdn.microsoft.com/en-us/library/ms141026.aspx BCP is a tool that has been used with SQL Server data since the first releases; it has multiple sources and destinations as well. It is a command-line utility, and has some limited transform capabilities. You can learn more about BCP here: http://msdn.microsoft.com/en-us/library/ms162802.aspx 

      Sqoop - Tied to Microsoft’s latest announcements with Hadoop on Windows and Windows Azure, Sqoop is a tool that is used to move data between SQL Server 2008R2 (and higher) and Hadoop, quickly and efficiently. You can read more about that in the Readme file here: http://www.microsoft.com/download/en/details.aspx?id=27584 

      Application Programming Interfaces - APIs exist in most every major language that can connect to one data source, access its data, optionally transform it, and store it in another system. Most every dialect of the .NET-based languages contains methods to perform this task.
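
      As one hedged example of the API route (using the widely available pyodbc module; the DSNs and table names here are hypothetical), a transfer-with-transform job can be just a few lines:

      import pyodbc

      # Hypothetical ODBC DSNs: read rows from one system, transform them,
      # and load them into another -- the same job SSIS or BCP performs,
      # driven through an API instead.
      src = pyodbc.connect("DSN=SourceSystem")
      dst = pyodbc.connect("DSN=Warehouse")

      rows = src.cursor().execute("SELECT id, amount FROM sales").fetchall()
      cleaned = [(r.id, max(r.amount, 0)) for r in rows]   # trivial transform

      cur = dst.cursor()
      cur.executemany("INSERT INTO staging_sales (id, amount) VALUES (?, ?)", cleaned)
      dst.commit()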

      Store and Process

      Data at rest is normally used for historical analysis. In some cases this analysis is performed near real-time, and in others historical data is analyzed periodically. Systems that handle data at rest range from simple storage to active management engines.

      SQL Server - Microsoft's flagship RDBMS can indeed store massive amounts of complex data. I am familiar with two systems in excess of 300 Terabytes of federated data, and the Pan-STARRS project is designed to handle 1+ Petabyte of data. The theoretical limit of SQL Server DataCenter edition is 540 Petabytes. SQL Server is an engine, so data access and storage are handled in an abstract layer that also handles concurrency for ACID properties. You can learn more about SQL Server here: http://www.microsoft.com/sqlserver/en/us/product-info/compare.aspx 

      SQL Azure Federations - SQL Azure is a database service from Microsoft associated with the Windows Azure platform. Database Servers are multi-tenant, but are shared across a “fabric” that moves active databases for redundancy and performance. Copies of all databases are kept triple-redundant with a consistent commitment model. Databases are (at this writing - check http://WindowsAzure.com for the latest) capped at a 150 GB size limit per database. However, Microsoft released a “Federation” technology, allowing you to query a head node and have the data federated out to multiple databases. This improves both size and performance. You can read more about SQL Azure Federations here: http://social.technet.microsoft.com/wiki/contents/articles/2281.federations-building-scalable-elastic-and-multi-tenant-database-solutions-with-sql-azure.aspx 

      Analysis Services - The Business Intelligence engine within SQL Server, called Analysis Services, can also handle extremely large data systems. In addition to traditional BI data store layouts (ROLAP, MOLAP and HOLAP), the latest version of SQL Server introduces the Vertipaq column-storage technology allowing more direct access to data and a different level of compression. You can read more about Analysis Services here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/analysis-services.aspx and more about Vertipaq here: http://msdn.microsoft.com/en-us/library/hh212945(v=SQL.110).aspx

      Parallel Data Warehouse - The Parallel Data Warehouse (PDW) offering from Microsoft is largely described by its title. Accessed in multiple ways, including Transact-SQL (the Microsoft dialect of the Structured Query Language), it is an MPP appliance scaling in parallel to extremely large datasets. It is a hardware and software offering - you can learn more about it here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/data-warehousing/pdw.aspx

      HPC Server - Microsoft's High-Performance Computing version of Windows Server deals not only with large data sets, but also with extremely complicated computing requirements. A scale-out architecture and interoperation with Linux systems, as well as dozens of applications pre-written to work with this server, make this a capable "Big Data" system. It is a mature offering, with a long track record of success in scientific, financial and other areas of data processing. It is available both on premises and in Windows Azure, and also in a hybrid of both models, allowing you to "rent" a supercomputer when needed. You can read more about it here: http://www.microsoft.com/hpc/en/us/product/cluster-computing.aspx 

      Hadoop - Pairing up with Hortonworks, Microsoft has released the open-source Hadoop system - including HDFS, standardized Map/Reduce software, Hive and Pig - on Windows and the Windows Azure platform. This is not a customized version; off-the-shelf concepts and queries work well here. You can read more about Hadoop here: http://hadoop.apache.org/common/docs/current/ and you can read more about Microsoft's offerings here: http://hortonworks.com/partners/microsoft/ and here: http://social.technet.microsoft.com/wiki/contents/articles/6204.hadoop-based-services-for-windows.aspx

      Windows and Azure Storage - Although not an engine - beyond a triple-redundant, immediately consistent commit - Windows Azure storage can hold terabytes of information and make it available to everything from the R programming language to the Hadoop offering. Binary storage (Blobs) and Table storage (key-value pair) data can be queried across a distributed environment. You can learn more about Windows Azure storage here: http://msdn.microsoft.com/en-us/library/windowsazure/gg433040.aspx 
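
      As a small sketch of the Table (key-value pair) storage model, again using the legacy azure Python SDK with placeholder credentials and a made-up table, every entity is addressed by a PartitionKey and a RowKey:

      from azure.storage import TableService   # legacy Windows Azure Python SDK

      table_service = TableService(account_name='myaccount', account_key='mykey')

      # Every entity needs a PartitionKey and a RowKey; other properties
      # are schema-free, NoSQL style.
      table_service.create_table('sensorreadings')
      table_service.insert_entity('sensorreadings',
          {'PartitionKey': 'station42', 'RowKey': '2012-06-01T00:00Z',
           'temperature': 21.4})
      entity = table_service.get_entity('sensorreadings', 'station42',
                                        '2012-06-01T00:00Z')
      print(entity.temperature)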

      Analysis

      In a "Big Data" environment, it's not unusual to have a specialized set of tasks for analyzing and even interpreting the data. This is a new field called "Data Science", with a requirement not only for computing skills but also a heavy emphasis on math.

      Transact-SQL - T-SQL is the dialect of the Structured Query Language used by Microsoft. It includes not only robust selection, updating and manipulating of data, but also analytical and domain-level interrogation as well. It can be used on SQL Server, PDW and ODBC data sources. You can read more about T-SQL here: http://msdn.microsoft.com/en-us/library/bb510741.aspx 

      Multidimensional Expressions and Data Analysis Expressions - The MDX and DAX languages allow you to query multidimensional data models that do not fit well with typical two-plane query languages. Pivots, aggregations and more are available within these constructs to query and work with data in Analysis Services. You can read more about MDX here: http://msdn.microsoft.com/en-us/library/ms145506(v=sql.110).aspx and more about DAX here: http://www.microsoft.com/download/en/details.aspx?id=28572 

      HPC Jobs and Tasks - Work submitted to the Windows HPC Server has a particular job - essentially a reservation request for resources. Within a job you can submit tasks, such as parametric sweeps and more. You can learn more about Jobs and Tasks here: http://technet.microsoft.com/en-us/library/cc719020(v=ws.10).aspx 

      HiveQL - HiveQL is the language used to query a Hive object running on Hadoop. You can see a tutorial on that process here: http://social.technet.microsoft.com/wiki/contents/articles/6628.aspx 

      Pig Latin - Pig Latin is the submission language for the Pig implementation on Hadoop. An example of that process is here: http://blogs.msdn.com/b/avkashchauhan/archive/2012/01/10/running-apache-pig-pig-latin-at-apache-hadoop-on-windows-azure.aspx 

      Application Programming Interfaces - Almost all of the analysis offerings have associated APIs - of special note is Microsoft Research's Infer.NET, a framework for running Bayesian inference in graphical models, as well as probabilistic programming. You can read more about Infer.NET here: http://research.microsoft.com/en-us/um/cambridge/projects/infernet/ 

      Presentation

      Lots of tools work for presenting the data once you have done the primary analysis. In fact, there's a great video comparison of various tools, primarily focused on Business Intelligence, here: http://msbiacademy.com/Lesson.aspx?id=73 That term itself is no longer as cleanly defined, but the tools I'll show below can be used in multiple ways - not just traditional Business Intelligence scenarios. Application Programming Interfaces (APIs) can also be used for presentation, but I'll focus here on "out of the box" tools.

      Excel - Microsoft's Excel can be used not only for single-desk analysis of data sets, but with larger datasets as well. It has interfaces into SQL Server and Analysis Services, can be connected to the PDW, and is a first-class job submission system for the Windows HPC Server. You can watch a video about Excel and big data here: http://www.microsoft.com/en-us/showcase/details.aspx?uuid=e20b7482-11c9-4965-b8f0-7fb6ac7a769f and you can also connect Excel to Hadoop: http://social.technet.microsoft.com/wiki/contents/articles/how-to-connect-excel-to-hadoop-on-azure-via-hiveodbc.aspx

      Reporting Services - Reporting Services is a SQL Server tool that can query and show data from multiple sources, all at once. It can also be used with Analysis Services. You can read more about Reporting Services here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/reporting-services.aspx 

      Power View - Power View is a “Self-Service” Business Intelligence reporting tool, which can work with on-premises data in addition to SQL Azure and other data. You can read more about it and see videos of Power View in action here: http://www.microsoft.com/sqlserver/en/us/future-editions/business-intelligence/SQL-Server-2012-reporting-services.aspx 

      SharePoint Services - Microsoft has rolled several capable tools into SharePoint as "Services". This has the advantage of integrating into the working environment of many companies. You can read more about lots of these reporting and analytic presentation tools here: http://technet.microsoft.com/en-us/sharepoint/ee692578 

      This is by no means an exhaustive list - more capabilities are added all the time to Microsoft’s products, and things will surely shift and merge as time goes on. Expect today’s “Big Data” to be tomorrow’s “Laptop Environment”.


                Big Data – infrastructure DBA – MongoDB and cloud environment        
      A medical imaging SaaS company is seeking a talented DBA (infrastructure). You will work with big data, scale the company's data set to millions of samples, perform large-scale data analysis and research, and handle performance, scale, availability, accuracy and monitoring.
                Digital Production BuZZ - July 24, 2014        

      Join Larry Jordan and co-host Michael Horton as they talk with:

      Zack Arnold, Editor/Director

      Zack Arnold has been a professional editor for 15 years, most recently on the TV series “Burn Notice.” He made his directorial debut on the documentary: “GO FAR: The Christopher Rush Story.” Zack’s latest venture is founding “Fitness In Post,” an online resource and community built specifically for people in the post-production industry who want to live a healthier lifestyle but don’t know where to start. This week, he tells us more about it.

      Jonathan Handel, Entertainment/Technology Attorney & Labor Reporter, TroyGould and The Hollywood Reporter

      SAG/AFTRA is claiming “significant gains” in the new contract just ratified. Jonathan Handel, Entertainment/Technology Attorney and labor reporter for “The Hollywood Reporter” gives us the details.

      Marty Lafferty, CEO, DCIA (Distributed Computing Industry Association)

      The DCIA (Distributed Computing Industry Association) is an international trade organization focused on the commercial advancement of cloud computing and related technologies, particularly as they are deployed for the delivery of high-value content. Marty Lafferty, CEO of DCIA, joins us this week to explain how big data, globalization and cloud computing are dramatically changing telecommunications.


                DATA DRY SPELL: The maritime industry has plenty of data, but few owners use it to improve operations        
      With nearly 91,000 vessels, the global maritime industry crosses social, economic and geographic frontiers, but it has not yet crossed the data boundary by embracing big data. With connectivity options and speeds improving, ships are beginning to join a data revolution that promises efficiency and cost savings. However, the question remains: what will be the …


                Lior Strahilevitz, "Personalizing Default Rules and Disclosure with Big Data"        
      The laws of intestacy are the same for men and women even though preferences for how one's estate should be divided differ by gender. Peanut-allergic octogenarian men and gluten-allergic pregnant women see the same warnings on consumer products even though they are interested in seeing information that is much better tailored to them. Companies have made enormous strides in studying and classifying groups of consumers, and yet almost none of this information is put to use by providing consumers with contractual default terms or disclosures that are tailored to their preferences and attributes. This lecture will explore the costs and benefits of personalizing various parts of American law and business practices. This talk was recorded on April 7, 2014. Lior Strahilevitz is Sidley Austin Professor of Law at the University of Chicago Law School.
                Business IT investment in Spain will fall 0.6% in 2017 and stagnate through 2020, according to IDC        

      Spending will surge more than 20% in the public cloud and Big Data categories


                Discover 7 new Microsoft MCSA and MCSE certifications         
      Microsoft have announced the launch of 6 new MCSA certifications and 1 new MCSE certification. This demonstrates Microsoft’s commitment to a growing Azure, Big Data, Business Intelligence (BI) and Dynamics community.

      These new certifications and courses will support Microsoft partners looking to upskill and validate knowledge in these technologies.  

      Following the huge changes announced in September, these new launches will simplify your path to certification. They'll minimise the number of steps required to earn a certification, while allowing you to align your skills to industry-recognised areas of competence.

      This blog will outline the new certifications Microsoft have announced, focusing on the technologies, skills and job roles they align to. 

      So what's new?


      MCSA: Microsoft Dynamics 365

      This MCSA: Microsoft Dynamics 365 certification is one of three Dynamics 365 certifications launched. It demonstrates your expertise in upgrading, configuring and customising the new Microsoft Dynamics 365 platform.

      There are currently no MOCs aligned to this certification. We have developed our own Firebrand material that will prepare you for the following two exams needed to achieve this certification:
      • MB2-715: Microsoft Dynamics 365 Customer Engagement Online Deployment
      • MB2-716: Microsoft Dynamics 365 Customization and Configuration 
      This certification will validate you have the skills for a position as a Dynamics 365 developer, implementation consultant, technical support engineer or system administrator.

      This certification is a prerequisite for the MCSE: Business Applications. 

      MCSA: Microsoft Dynamics 365 for Operations

      The second of these three Dynamics 365 certs is the MCSA: Microsoft Dynamics 365 for Operations. Here, you’ll get the skills to manage a Microsoft SQL Server database and customise Microsoft Dynamics 365.

      On this course, you’ll cover the following MOC:
      • 20764: Administering a SQL Database Infrastructure 
      The second part of this course, for which there is currently no MOC, will use Firebrand's own material. 

      To achieve this certification you’ll need to pass the following exams:
      • 70-764: Administering a SQL Database Infrastructure
      • MB6-890: Microsoft Development AX Development Introduction 
      Earning this cert proves you have the technical competence for positions such as Dynamics 365 developer, solutions architect or implementer.  

      Just like the MCSA: Microsoft Dynamics 365, this certification is also a prerequisite to the new MCSE: Business Applications certification. 

      MCSE: Business Applications

      Earning an MCSE certification validates a more advanced level of knowledge. The MCSE: Business Applications certification proves an expert-level competence in installing, operating and managing Microsoft Dynamics 365 technologies in an enterprise environment.

      In order to achieve this certification you’ll be required to pass either the MCSA: Microsoft Dynamics 365 or the MCSA: Microsoft Dynamics 365 for Operations. You’ll also be required to choose one of the following electives to demonstrate expertise on a business-specific area:
      • MB2-717: Microsoft Dynamics 365 for Sales
      • MB2-718: Microsoft Dynamics 365 for Customer Service
      • MB6-892: Microsoft Dynamics AX - Distribution and Trade
      • MB6-893: Microsoft Dynamics AX - Financials  
      Earning your MCSE: Business Applications certification will qualify you for roles such as Dynamics 365 developer, implementation consultant, technical support engineer, or system administrator.

      MCSA: Big Data Engineering

      This MCSA: Big Data Engineering certification demonstrates you have the skills to design and implement big data engineering workflows with the Microsoft cloud ecosystem and Microsoft HDInsight to extract strategic value from your data.

      On this course you’ll cover the following MOCs:
      • 20775A: Perform Data Engineering on Microsoft HDInsight – expected 28/6/2017
      • 20776A: Engineering Data with Microsoft Cloud Services – expected 08/2017
      And take the following exams:
      • 70-775: Perform Data Engineering on Microsoft HDInsight – available now in beta
      • 70-776: Engineering Data with Microsoft Cloud Services – expected Q1 2018
      This course is aimed at data engineers, data architects, data scientists and data developers.

      Earning this MCSA acts as a prerequisite, and your first step, to achieving the MCSE: Data Management and Analytics credential.

      MCSA: BI Reporting

      This MCSA: BI Reporting certification proves your understanding of data analysis using Power BI. You’ll learn the skills to create and manage enterprise business intelligence solutions.

      The MOCs you’ll cover on this course include:
      • 20778A: Analyzing Data with Power BI
      • 20768B: Developing SQL Data Models 
      In order to achieve the certification, you’ll take the following exams:
      • 70-778: Analyzing Data with Power BI - expected Q1 2018
      • 70-768: Developing SQL Data Models 
      This certification is aimed at database professionals needing to create enterprise BI solutions and present data using alternative methods.

      This certification is a prerequisite for the MCSE: Data Management and Analytics credential. 

      MCSA: Cloud Database Development 

      This MCSA: Cloud Database Development certification will prove you have the skills to build and implement NoSQL solutions with DocumentDB and Azure Search for the Azure data platform.

      This certification covers the following MOCs:
      • 40441: Designing and Implementing Cloud Data Platform Solutions
      • 20777: Implementing NoSQL Solutions with DocumentDB and Azure Search – expected in August 2017 
      In order to achieve the certification, you'll have to pass the following exams: 
      • 70-473: Designing and Implementing Cloud Data Platform Solutions
      • 70-777: Implementing NoSQL Solutions with DocumentDB and Azure Search – expected in Q1 2018
      This course is aimed at specialist professionals looking to validate their skills and knowledge of developing NoSQL solutions for the Azure data platform. 

      This certification is also a prerequisite certification to the MCSE: Data Management and Analytics credential. 

      MCSA: Data Science

      This course will teach you to operationalise Microsoft Azure machine learning and Big Data with R Server and SQL R Services. You'll learn to process and analyse large data sets using R and use Azure cloud services to build and deploy intelligent solutions.

      This certification covers the following MOCs:
      • 20773A: Analyzing Big Data with Microsoft R – in development, expected May 2017
      • 20774A: Perform Cloud Data Science with Azure Machine Learning – in development, expected June 2017
      To achieve this certification you’ll be required to pass the following exams:
      • 70-773: Analyzing Big Data with Microsoft R – available now in beta
      • 70-774: Perform Cloud Data Science with Azure Machine Learning – available now in beta 
      This certification, which is your first step to the MCSE: Data Management and Analytics cert, is best suited to data science or data analyst job roles. 


                Common Challenges of Big Data and How to Overcome Them        
      Big data offers a lot of opportunities to the few companies that use it. However, one main reason why a larger percentage of the corporate world is yet to embrace big data is because of...


                E&P Connect Exclusive: Effective Completions        
      Austin, Texas-based software firm Ayata reviews how big data analytics leads to high density completion techniques.
                Maximizing Oil Production with Prescriptive Analytics        
      The transformative promise of Big Data Analytics is to generate actionable insights from massive amounts of constantly evolving data, and to then leverage those insights to achieve positive, meaningful business and societal outcomes.  Listen as Ayata’s SVP of Sales and Marketing, Daniel Mohan, discusses how Ayata is helping top-performing operators in unconventional plays frustrated by
                President Obama        

      President Obama ---The Man/The Icon



                      David J. Garrow's Rising Star: The Making of Barack Obama (New York: William Morrow, 2017) is a big book.  Its ten chapters of narrative occupy 1078 pages; the remaining 383 pages consist of the acknowledgements (1079-1084), the copious chapter notes (1085-1356), the bibliography (1357-1391), the index (1393-1460) and the "About the Author" page (1461).  Are so many pages needed to cover the life of Barack Hussein Obama II from August 4, 1961 to January 19, 2017?  Yes.  Do so many pages adequately provide full disclosure of Obama's rise as our most noteworthy Kenyan American and 44th President?  No.  A single book can't possibly give us all the contextualized facts we either need to know or think we need.  A trenchant analysis of anything in our everyday lives, especially of major figures and events in American politics, requires a crunching of big data and the writing of persuasive narratives.  Rising Star is Garrow's effort to make a compelling statement about our rage for social, political, and cultural information.  His success, however, compounds the difficulty of knowing what is truly necessary and sufficient.



                      Reading Rising Star cover to cover is probably not the path many readers will take.  They will sample chapters and depend on the index to guide them to topics which seem to be of immediate relevance.  Unlike their nineteenth-century ancestors, most contemporary readers lack the patience and discipline to engage a big book ---unless the book pertains directly to a job, career advancement or retrofitting, and a paycheck.  Even for readers who work in the arena of politics, policy decisions may be of greater importance than expanding their sense of history.  Rising Star will be relegated to a shelf of reference books and consulted only when a search engine doesn't provide immediate access to specialized information or "factoids" about President Obama and his eight years in office.



                      We can anticipate that Rising Star will eventually appear on the collateral reading lists for advanced graduate courses in American government, political theory, historiography, or the politics of race.  Special, limited audiences of teachers and students will explore Garrow's artistry in aligning snapshots of Obama the man (organic human being) with formal photographs of Obama the president (the fashioned or constructed political being).  They will be positioned to make sense of Garrow's pragmatic coup de grâce:



      In Springfield too a perceptive woman understood how Barack "is an invention of himself."  But it was essential to appreciate that while the crucible of self-creation had produced an ironclad will, the vessel was hollow at its core. "You didn't let anyone sneak up behind you to see emotions --like hurt or fear-- you didn't want them to see," Barack long ago had taught himself, yet hand in hand with that resolute self-discipline came a profound emptiness. (1078) [my italics]



      Irony of irony that what is imagined to be hollow and empty will in time be seen to be solid and full. We shall need yet another 1461 pages to begin to understand the quintessential American irony that Garrow invites us to ponder.



      Jerry W. Ward, Jr.                            July 2, 2017



                Ubiquity in the future of Banking: our experience at Money 20/20 Europe        

      An interesting experience for Ubiquity at the Money 20/20 Europe event in Copenhagen. Money 20/20 Europe, the most important fintech event in Europe, has been designed to bring together all the stakeholders with a role to play in the trade revolution: payment and financial services providers, banks, the mobile ecosystem, the retail industry, marketing services, big data … Continue reading Ubiquity in the future of Banking: our experience at Money 20/20 Europe

      The article Ubiquity in the future of Banking: our experience at Money 20/20 Europe appeared first on Ubiquity.


                What's the "Optimal" Failure Rate at Netflix?        
      When Orange is the New Black, House of Cards, and The Crown became mega-hits for Netflix, many people credited the company's analytics capabilities. Mining customer data had enabled the firm to project the type of original programming that would be highly successful. By this logic, Netflix would achieve a lower failure rate on new shows than the major television networks. After all, broadcasters such as CBS and NBC cancel a substantial share of their new shows each year, some after only a few episodes. 

      On the recent Netflix earnings call, many investors were pleased to hear about strong subscriber growth at the firm. However, some investors came away concerned about the amount of spending taking place as the firm acquires or develops new content. Moreover, some observers and analysts have expressed concern about the recent cancellations of some new Netflix original shows. Tom Huddleston Jr. reported on the company's reaction to this criticism in a recent Fortune article:

      Meanwhile, also on the Monday earnings call, Netflix's chief content officer Ted Sarandos defended the company's recent cancellations of a handful of expensive, but underperforming, original series. "The more shows we have, the more likely in absolute numbers that you’ll see cancellations, of course," Sarandos said. The executive compared Netflix's recent spate of cancellations—including big-budget series like The Get Down and Sense8—to traditional TV networks that cancel nearly one-third of their new shows after their first seasons. Netflix, he said, has renewed 93% of its original series. With respect to the shows that Netflix opted not to renew, Sarandos argued: "If you’re not failing, maybe you’re not trying hard enough."

      This quote from Sarandos raises a fascinating question.  What is the "optimal" failure rate at Netflix?  Surely, we would like the failure rate to be lower than the broadcast networks'.  We would like to see the company reaping the benefits of its analytics capabilities.  At the same time, no one should want Netflix's failure rate on original programming to be zero.  We want the firm to take some chances in hopes of landing some surprising breakthrough hits.  Hopefully, the firm isn't simply guessing or drawing on the intuition of the "creatives" in the business.  We would like to see them engaging in "enlightened" experimentation, using big data to guide them while still taking some risks.  If they balance data mining and risk-taking in an effective way, the failure rate won't be zero, but it will be much lower than their broadcast and cable competitors'.  
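
      To see why the optimal rate sits strictly between zero and the networks' one-third, consider a toy expected-value model (a minimal sketch; every number here is an illustrative assumption, not Netflix data): riskier slates fail more often but pay more when they hit, so the value-maximizing failure rate is interior.

      # Minimal sketch of an "optimal failure rate" (all parameters are
      # illustrative assumptions, not Netflix data).
      def value_per_show(risk):
          p_fail = 0.02 + risk         # riskier slates are cancelled more often...
          payoff = 1.0 + 2.0 * risk    # ...but a hit from a bold bet pays more
          cost = 0.5                   # production cost per show (same units)
          return (1 - p_fail) * payoff - cost

      for risk in (0.0, 0.1, 0.2, 0.3, 0.5):
          print(f"failure rate {0.02 + risk:.0%} -> expected value/show "
                f"{value_per_show(risk):.3f}")

      Under these made-up parameters the expected value peaks around a 20-25% failure rate: above zero, but well below the networks' one-third.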

                Comment on Big Data and The Great A.I. Awakening. Interview with Steve Lohr by Giovanni Panzeri        
      “AI is akin to building a rocket ship. You need a huge engine and a lot of fuel. The rocket engine is the learning algorithms but the fuel is the huge amounts of data we can feed to these algorithms.” by Andrew Ng
                Comment on Big Data: Three questions to Aerospike. by Brian Bulkowski        
      Brian here again -- we've released our Aerospike Hadoop integration, which lets you easily use Aerospike as your Hadoop datastore. It's location-aware, so you can run MapReduce without network traffic, or you can emit results to Aerospike in real time to use your insights in applications _immediately_. http://github.com/aerospike/aerospike-hadoop
                Soluto Impresses at TechCrunch Disrupt        

      Two days after launching the beta version of their first product, Soluto was anointed the winner of the hyper-competitive start-up battle at TechCrunch Disrupt yesterday in New York. This is a prestigious accolade and an ideal launch pad, especially for a company like Soluto, which needs a large user base to perfect its product. To be clear, this was not just any start-up competition, or simply the latest crop of Web start-ups, but Web start-ups that have the potential to be disruptive. I know it’s a loaded word that many will contest until they are blue in the face, but simply adding this modifier made this more interesting and challenging than usual. I applaud TechCrunch for putting on the event, and am thrilled that my own portfolio company would take home the victory cup. So congratulations to Tomer, Ishay, Roee and the rest of the hard working Soluto team!



      And for those of you who don’t already know, Soluto is developing anti-frustration software. This download and accompanying service aims to lessen, if not eliminate, the frustration PC users feel when they twiddle their thumbs waiting for their computer to boot, staring at a frozen mouse cursor or rotating hourglass, or screaming in anguish when an application suddenly crashes on them.

      I invested in Soluto foremost because of the strong entrepreneurs, who exhibit that rare combination of technology depth and aptitude for consumer products. However, beyond the team and market potential, Soluto had a particular resonance with me because of their vision and approach. It fit squarely with my own investment roadmap around companies that leverage technology and their user base to create innovative web-based services for consumers.

      My favorite motif within this “technology-enabled, crowd source-enhanced web service” investment roadmap of mine is that of Big Data-based services. Big Data simply refers to incredibly large data sets that are too cumbersome to accumulate, let alone work with and make sense of. I am not so much interested in the companies developing infrastructure solutions to manage data, but rather in companies that are developing new products and services based on their ability to capture big data, synthesize and analyze it, and package it into simple yet valuable consumer products and services.

      Initially, the appeal lies in the fact that very often the data already exists, but is buried or otherwise inaccessible. Secondly, I am attracted by the idea that the product will strengthen with more use and over time, creating a naturally widening lead over any aspiring competition (large or small). I am increasingly of the opinion that to be successful, particularly out of Israel, web start-ups must leverage strong technology and/or the power of the crowd to maintain a competitive advantage in the face of so much competition for customers and investment dollars.

      All of this is far from trivial, but Soluto aims to do just this. They start with a powerful, yet very intuitive download, which serves a dual purpose of providing a free boot utility to consumers, while capturing important data anonymously, not unlike anti-virus software. This “passive” crowd sourcing is valuable because Soluto has already built the backend of their service which knows how to make sense of the data for the creation of the second order product, which is the anti-frustration service. There is also “active” crowd sourcing through the techie users who can easily contribute their knowledge and solutions to the product.

      With more than a billion PCs in use, most of their users frustrated, the business opportunity is enormous. The intense demand explains the relative success of snake oil solutions like registry cleaners or extreme methods like repetitive reimaging. And as anti-virus increasingly becomes a commodity or outright free, anti-frustration software, pioneered by Soluto, may be its natural successor.

      Even though it is still early days at Soluto, I continue to look for more companies that pursue similar strategies. In fact, I hope to announce my next “technology-enabled, crowd source-enhanced” web service investment soon. In the meantime, download and install Soluto!


                Why the iPhone's UI won't scale        
      For all its beauty and elegance, the iPhone's UI, in the state demoed on Apple's website and at Macworld, has at least two fundamental issues, even disregarding the whole touch-screen/haptics debate.

      These two issues are scalability and contextuality -- a lack of both. I'll address the first issue in this entry, and the second issue in a later entry.

      There are two areas where the iPhone UI will fail to scale.

      1. One-touch home page can't scale

      The iPhone relies on being a feature phone (not a smartphone, see my previous entry) to implement Steve Jobs's vaunted two clicks from anywhere UI functionality. If you add extra apps, for example a pedometer, a finances app, a possessions (eg. books, CDs, etc.) database, an ebook reader, a word processor, a spreadsheet, a presentation app, a dictionary, etc. etc. you will quickly run out of screen real estate.

      When that happens, you have two choices:
      • Add a scroll bar, which makes some items three clicks away (tap the home button, tap the scroll bar or "flick scroll", tap the icon) or more
      • Add folders, which makes items three taps away (tap home, tap the folder, tap the icon)

      Basically, there's nothing magic about Apple's "two-clicks to anywhere". It's just a result of crippleware.

      2. Flick-scrolling without context reduction only works for small datasets

      This one is a bigger problem for Apple. Pay careful attention to the demo of the Contacts application on the iPhone (available at http://www.apple.com/iphone/phone/). Notice that the app has no search icon or text pane. All it has is a list of contacts and the alphabet down the side.

      The demo shows how cool flick-scrolling looks. What it doesn't show is how painful it would be searching through a database of 400+ contacts (which is not a big database for many users, now that people sync with their PCs). Flick-scrolling is inherently imprecise, and thus a slow way to find a single item in a large dataset (which is mostly what you want to do with a contacts database on a phone).

      What's the fastest way to find a contact? Well, iContacts on the Mac actually has it built in: a filter that narrows the contacts list. It's called Spotlight, and is available on virtually every window on the Mac. However, it is conspicuously absent from the iPhone.

      (The reason it's absent from the iPhone is pretty easy to guess: Spotlight isn't much chop without a keyboard to enter text, but the iPhone doesn't have any ugly plastic buttons, so if you want to enter text on it, your usable screen space suddenly vanishes away, eaten up by an ugly onscreen keyboard -- have a look at the SMS demo at the iPhone site. So filtering a large list by entering text is not something that the iPhone's form-factor is very good at.)

      Unfortunately, flick-scrolling really isn't a substitute for Spotlight-style filtering for two reasons:

      1. Flick-scrolling is imprecise. I've already mentioned that this imprecision makes navigating to a single contact a pain. It's hard to describe, and, until I've played with an iPhone, I can't be sure just how painful flick-scrolling will be, but I'm pretty sure it'll be painful. Even if flick-scrolling is magically wonderful, there's still another reason why it's vastly inferior to filtering a long list. It is telling that the contacts list in Apple's demo is pretty small.
      2. Long lists are hard to visually search. The item you're looking for just gets lost in the midst of the huge number of items you're looking through. Humans are very good at pattern matching, but even humans get overwhelmed if there are simply too many candidates to match against, and scrolling doesn't reduce the candidate pool.

      This is where filtering really makes its money: it reduces the context to the minimal, useful context. If I'm looking for my contact in my database, all the other Lithgows overwhelm it. Even if I can tap on "L", I'm still faced with a lot of distractingly similar near-matches. But if I can filter for "Malcolm", then I can remove all of them with seven touches (in fact, I can remove pretty much all of them with three or four touches: "mal" or "malc"). Then I don't have to scour the list, I simply choose the only option.
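
      To make the arithmetic of filtering concrete, here is a minimal sketch (the contact names and the simple substring match are my own assumptions, not Apple's implementation) of how each typed character shrinks the candidate pool:

      # Minimal sketch: filtering reduces the context with every keystroke,
      # while flick-scrolling leaves the full list to scan visually.
      contacts = ["Malcolm Lithgow", "Mary Lithgow", "Mike Lithgow",
                  "Mal Jones", "Nora Malone"] + [f"Contact {i}" for i in range(400)]

      def filter_contacts(query, names):
          """Return only the names containing the typed query, case-insensitively."""
          q = query.lower()
          return [n for n in names if q in n.lower()]

      for typed in ("m", "mal", "malc"):
          print(f"typed {typed!r}: {len(filter_contacts(typed, contacts))} candidates left")
      # 'malc' leaves a single obvious choice instead of a 400+ item scroll.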

      The inherent lack of this capability in the iPhone's UI will make for a frustrating experience for people who have any significant amount of data. The iPhone thus limits itself to toy status (much as the Newton did up until its swansong with the MP 2000).

      Can Apple fix this? Yes, they can, but fixing involves moving back towards standard PDA interfaces, either providing a physical keyboard (unlikely), or providing some form of touch-input for letters (there are many innovative solutions out there, check out Ring-Writer, for example).

      But there are other problems with the iPhone's UI that indicate that Apple has been thinking more about glamour than substance. The major one is the lack of contextuality, and I'll be talking about that next. Stay tuned.


                Cray Targets Oil and Gas Sector’s Big Data Needs        

      Supercomputer maker Cray is helping oil and gas companies benefit from the most advanced reservoir modeling approach yet. Called Permanent Reservoir Monitoring (PRM), the technique requires innovative data warehousing technology and data analysis techniques.

      The post Cray Targets Oil and Gas Sector’s Big Data Needs appeared first on HPCwire.


                Sr Front End Engineer - Zartis - Madrid        
      We are looking for a front-end engineer for one of our clients, a disruptive Big Data software startup based in the city center of Madrid. You would be part of an innovative team. Responsibilities: your main responsibility will be to take their system to the next level in terms of design, UX and architecture. Requirements: a minimum of 3 years of relevant front-end experience. Technologies: JS, CSS, Sass, FE frameworks (at least one of the following: Angular, Ember, Vue, React),...
                DevOps Engineer - Zartis - Madrid        
      We are looking for a DevOps specialist for one of our clients, a disruptive Big Data software startup based in the city center of Madrid. You would be part of a team in charge of supporting the company infrastructure and its associated systems. Requirements: a Bachelor's degree in Computer Science or similar. A passion for Unix/Linux (some experience with Ubuntu/Debian and macOS required). Experience with the cloud. Experience managing continuous delivery environments. Our client...
                FirstAID 5.0: Firebird 3.0 and InterBase XE7 and support of 100+Gb databases        
      We are proud to present IBSurgeon FirstAID 5.0 – the new version of the recovery software with the highest rate of successful repairs. FirstAID 5.0 is a major improvement: it now supports Firebird 3.0, InterBase XE7, and big databases (100 GB+). Download IBSurgeon FirstAID 5.0. If you are a user of FirstAID version 3.x or 4.x, you can log into IBSurgeon
                Big data, cool kids        
      The big data world is a confusing place. We’re no longer in a market dominated mostly by relational databases, and the alternatives have multiplied in a baby boom of diversity. These child prodigies of the data scene show great promise …
                Five big data predictions for 2013        
      Here are some of the key big data themes I expect to dominate 2013, and of course will be covering in Strata. Emergence of a big data architecture The coming year will mark the graduation for many big data pilot …
                Why big data is big: the digital nervous system        
      Where does all the data in “big data” come from? And why isn’t big data just a concern for companies such as Facebook and Google? The answer is that the web companies are the forerunners. Driven by social, mobile, and …
                Big Data Online Training and Placement Assistance -         
      Hadoop has huge scope for evolution, and the demand for Big Data Hadoop professionals is and will be on the rise. Our Hadoop Big Data course is just what you n...
                Democracy at risk: the terrifying power of 'big data'        
      The game of political campaigning has changed, and almost everyone is still playing the old game.
                Visualization – Big Data – Analytics – BlackHat US Workshop        
      Visual Analytics Workshop at BlackHat Las Vegas 2017. Sign up today! Once again, at BlackHat Las Vegas, I will be teaching the Visual Analytics for Security Workshop. This is the 5th year in a row that I’ll be teaching this class at BlackHat US. Overall, it’s the 29th! time that I’ll be teaching this workshop. Every […]
                A Bright Future: Innovation Transforming Public Health in Chicago        
      Big cities continue to be centers for innovative solutions and services. Governments are quickly identifying opportunities to take advantage of this energy and revolutionize the means by which they deliver services to the public. The governmental public health sector is rapidly evolving in this respect, and Chicago is an emerging example of some of the changes to come. Governments are gradually adopting innovative informatics and big data tools and strategies, led by pioneering jurisdictions that are piecing together the standards, policy frameworks, and leadership structures fundamental to effective analytics use. They give an enticing glimpse of the technology's potential and a sense of the challenges that stand in the way. This is a rapidly evolving environment, and cities can work with partners to capitalize on the innovative energies of civic tech communities, health care systems, and emerging markets to introduce new methods to solve old problems.
                2017 Hiring Trends in Cybersecurity        
      It’s no secret that cybersecurity is the single largest challenge facing CIOs and tech leaders in 2017. With mass digitalization, exponential increases in big data volume, the growing popularity of […]
                Episode 20 – Glenn Block is a fan of the mentoring culture - The Mentoring Developers Podcast with Arsalan Ahmed: Interviews with mentors and apprentices | Career and Technical Advice | Diversity in Software | Struggles, Anxieties, and Career Choices        
      Glenn’s Bio: By day Glenn Block works at Splunk making it easier for developers to work with Big Data as he drives the development of Splunk’s Dev platform. By night, Glenn is an active maintainer and contributor of several OSS projects including scripts (https://github.com/scriptcs/scri…). He is a polyglot with his most recent favorite language being...
                A New, Comprehensive, 4-In-1, Big Data Resource To Aid Color Innovation        
      An unusual new global big-color database to aid color decision-making, including competitive intelligence across industries, online trend "listening", color trademarks by industry and country, and global color research studies
                Checking the Health of the Economy        
      Modern society lives with Big Data and statistics, but every statistic has a story behind it. Though the United States economy is improving, there are some numbers or
                Big Data’s Relationship with The Internet of Things        

      I may be a couple of days late for Valentine’s Day, but there is a serious love fest between Big Data and the Internet of Things. What is Big Data? Wikipedia says: Big data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, […]

      The post Big Data’s Relationship with The Internet of Things appeared first on ioBridge Blog.


                Data Flow Equals Transport Flow        

      The transport industry and urban logistics benefit from telematics. But will that still be true the day after tomorrow? New approaches point to an exciting path into the future, for example toward the Physical Internet. Photo: iStock/Alija

      Telematics? In times of digitalization, Logistics 4.0, big data, e-commerce and real-time notification, the term seems outdated. Read more


                Big Data and the CMO: An Introduction to the Challenge and the Opportunity        
      The amount of data being generated is increasing by orders of magnitude year-over-year. Traditionally, this hasn't affected organizations much as they have the data that they are required to keep, the primary data that ties ...
                #SQLSaturday 459 Madrid 2015        
      SQLSaturday is a free event for professionals and future professionals working with SQL Server, Big Data, Business Intelligence and IoT, to be held on November 21, 2015 in Madrid. More than 400 SQLSaturday events have been held around the world, and this will be the second edition in Spain, where the  […]
                Babbel Closes $10 Million Series B Funding Round        

      Babbel, the online learning system for foreign languages, today announced the closing of a series B funding round. Leading the round is Reed Elsevier Ventures. Other investors include Nokia Growth Partners as well as existing investors, IBB Beteiligungsgesellschaft via its VC Fonds Technologie Berlin, and Kizoo Technology Ventures. The investment will be used to accelerate international expansion and improve the adaptation to all relevant mobile and online platforms.

      Present in more than 190 countries, the Berlin-based startup has so far had its strongest footprint in the German market. Now it will aggressively enter other European countries, the Americas and emerging markets. Babbel will also extend its partnerships with hardware manufacturers, platform providers and media across the world.

      Babbel.com is operated by Lesson Nine GmbH, Berlin. The company had previously raised a total of $2.2M in equity and debt and has experienced rapid revenue growth of over 200% per year since 2011. Recently, Lesson Nine announced the acquisition of San Francisco-based competitor PlaySay (http://press.babbel.com/en/releases/2013-03-21-education-startup-babbel.com-acquires-san-francisco-based-playsay.html). Unrelated to the new investment, the deal was made with operating cash flow.

      The basis for the language learning system’s success is its consistent use of mobile and Internet technologies and the integration of modern, practical learning content that motivates and guides the learner in an entertaining way. Over 6500 tailor-made learning hours for thirteen languages are available to learners online, as an iPad app and as free vocabulary training apps for iPhone, Android and Windows 8. As of today, the apps have been downloaded over 8 million times.

      “Babbel is a European digital media success story and I am delighted that we are joining the investor group at this exciting time”, says Tony Askew, General Partner at Reed Elsevier Ventures. “The startup has grown rapidly to over 15 million users and has built a large subscriber base which generates positive cash flow. Babbel’s excellent mobile and online products consistently rate as consumer favorites and Babbel is very well-positioned for explosive growth in the rapidly growing category of mobile and online language learning.”

      “Nokia Growth Partners believes that in a converged digital world, every business must be mobile and this principle drives our investments,” adds Walter Masalin, principal at Nokia Growth Partners. “As mobile transforms the way people learn, Babbel’s flexible and efficient solution supporting multiple platforms means it is well positioned to capitalize on this trend.”

      “Since our investment in 2008, Lesson Nine has successfully occupied various markets with its innovative products, and has established itself worldwide as a serious player in the realm of mobile language learning”, says Marco Zeller, Managing Director of IBB Beteiligungsgesellschaft mbH. “This funding round, including other international investors, honors the Berlin company’s extremely positive development, and creates a foundation for even more dynamic growth. We are proud to have been on board with this success story since the beginning, and also to provide more capital as part of this round.”

      Michael Greve, CEO of Kizoo Technology Ventures, says: “Since we started working with Babbel five years ago, the product has made an exciting journey from a nice web tool to a modern and fun language-learning experience with a huge user base eager to subscribe to the service. I believe the ideal platform for language learning is tablets, and we can expect even more accelerated growth of the beautiful mobile Babbel products in the future.”

      “We are happy to have two new high-profile international investors on board. This financing round opens a great number of opportunities without limiting our strategic options. The renewed participation of existing investors IBB Beteiligungsgesellschaft and Kizoo also pleases me. For our great team of seventy people, there’s still much to be done and much to achieve,” says Markus Witte, CEO of Lesson Nine GmbH.

      About babbel.com: Babbel is the new way to learn languages. With the online language learning system, both beginners and continuing learners can study French, Spanish, Italian, Brazilian Portuguese, Swedish, German, Dutch, Indonesian, Polish, Turkish, Norwegian, Danish and English with the help of interactive listening, writing and speaking exercises. The website babbel.com offers numerous online courses. In addition there are apps for iPad, iPhone, iPod, Android and Windows 8 devices, as well as interactive eBooks. More than 15 million people from over 190 countries are already learning a language with Babbel. Babbel is operated by Lesson Nine GmbH, Berlin. The company was founded in August, 2007, and now has around 170 employees and freelancers. Since March, 2013, Lesson Nine has been involved with Reed Elsevier Ventures, Nokia Growth Partners, Kizoo Technology Ventures and IBB Beteiligungsgesellschaft via its VC Fonds Technologie Berlin. Further information at: http://www.babbel.com

      About Reed Elsevier Ventures: Founded in 2000, Reed Elsevier Ventures is a venture capital firm based in London and San Francisco and backed by one of the world’s most successful media and information companies, Reed Elsevier. Reed Elsevier Ventures invests in talented and ambitious entrepreneurs and management teams who have the drive to build large, scalable businesses and the determination to become industry leaders. Reed Elsevier Ventures focus on high growth, internet, media and technology businesses based in the US, Europe or Israel in sectors such as big data and analytics, mobile, new media, healthcare information and groundbreaking analytic technologies. Example portfolio companies include Palantir, one of Silicon Valley’s most valuable technology companies, and Babylon, the world’s most downloaded language translation tool.

      About Nokia Growth Partners: Nokia Growth Partners invests in companies that are changing the face of mobility, communications and the internet. NGP offers industry expertise, capital and an extensive network, enabling entrepreneurs to build disruptive, industry-changing companies and take them to the global market. With offices in the US, Europe, India and China, NGP extends the reach of companies making their products and services local everywhere. Visit http://www.nokiagrowthpartners.com/ for more information.

      About IBB Beteiligungsgesellschaft mbH: The IBB Beteiligungsgesellschaft provides venture capital to innovative Berlin enterprises and has established itself as a market leader in early-stage financing in Berlin. The funds are used primarily for the development and market launch of innovative products or services, as well as for business concepts of the creative industries. Currently two of the funds managed by the IBB Beteiligungsgesellschaft are in the investment phase: the VC Fonds Technologie Berlin, with a fund size of €52 million, and the VC Fonds Kreativwirtschaft Berlin, with a fund size of €30 million. Both VC funds are financed by the Investitionsbank Berlin (IBB) and the European Fund for Regional Development (EFRE) administered by the State of Berlin. Since 1997 the IBB Beteiligungsgesellschaft, in consortia with partners, has made €850 million available to creative and technology-orientated companies; of this, the portion invested by IBB Beteiligungsgesellschaft itself, as lead, co-lead or co-investor, was approximately €116 million.

      About Kizoo Technology Ventures: Kizoo helps young start-up teams grow. As a seed and early stage investor with a focus on SaaS, Internet & Mobile Services and Social Applications Kizoo is happy to share its own longtime experience in development, marketing and product management in those markets.
 www.kizoo.de


                Operators could take in 85 billion dollars thanks to IoT and Big Data        
      A study by Juniper Research estimates that mobile network operators can take in an additional 85 billion dollars in revenue over the next five years by deploying and improving non-core services, including Big Data analytics and IoT (the Internet of Things).
                A Big Data services unit for Telefónica's corporate customers        
      LUCA covers all the Big Data needs of companies and organizations, from data management and analytics to the use of tools and infrastructure
                Movistar+ improves content personalization with Big Data tools        
      The new service gets closer to each customer's tastes and builds a personalized offering
                Big Data Engineer - Grupo DIA - Las Rozas (Madrid)        
      We need a Big Data Engineer. Duties: ensure that the designed platform meets the organization's expectations and requirements for assembling in-house and external analytical models. Participate in the sizing, design and development of the architectures, according to the data and needs analyzed, based on the distributions AWS, Cloudera, Hortonworks, MapR, ... Predictive analytics and data mining: analytical models and engines for...
                Data Scientist: how to get value from company data        

      Data Scientist: how to get value from data. The art of exploiting the asset of Big Data as a source for experimenting with and innovating brands' business models and fostering the creation of added value. Analysis and reflections on the emerging figure of the Data Scientist. An article full of information and curiosities, put together with practical and relevant support […]

      The article Data Scientist: how to get value from company data appeared first on B2corporate.


                Statisticians and "Big Data" Analysts in High Demand        
      When I was a graduate student back in the dark ages, I took an advanced statistics course and then briefly worked in a laboratory where statistical analysis of data derived from animal models of disea...
                By: Krishna Sankar        
      Yep, excellent set of topics. We are moving past the 3Vs of Big Data to the Three Amigos of Big Data - Interface, Intelligence & Inference. http://goo.gl/NqOWQ A couple of observations: wearable devices are missing from the list. Deep learning definitely is an emerging topic, something close to my heart. I think an analytic sandbox supported by a data landing zone is more accurate than a convergence of databases. Lastly, the larger picture is the big data pipeline spanning data management and data science, viz. Collect-Store-Transform-Model-Reason-Visualize/Predict/Recommend-Explore. Cheers
                Motivational Keynote Speaker in Bangalore, India        
        Patrick Schwerdtfeger is available to speak in Bangalore as a leading authority on technology (including ‘big data’ and predictive analytics) and global business trends (including demographics, particularly in south Asia). He’s the author of the award-winning book Marketing Shortcuts for the Self-Employed (2011, Wiley) and a regular speaker for Bloomberg TV. If you are […]
                FRSiH 2016: The food industry. Business 4.0 - new technologies as a driver of competitiveness. Part II - full report        
      Automation processes and multifaceted data analysis based on Big Data allow companies to control processes related both to the flow of goods and to cash transactions. We are still far from complete automation, however. That is why it is worth remembering the needs of employees who, without proper inclusion in modern processes, may remain on the margins of the industrial revolution - The food industry. Business 4.0 - new technologies as a driver of competitiveness. Part II
                The 11 Steps Established Vendors Go Through when Something New Comes Along        
      As an analyst, I often see new Big Things coming by. Things like cloud, big data, mobility, even going back to client-server. The reaction of established vendors to these Big Things is pretty predictable, even depressingly so. Here is what it looks like, where X = whatever is new. X will never work. No one […]
                What else? - a great coaching question but what else?         
      Julie Johnson, a business associate of mine, posted this piece on her blog about the 'What else?' question in a coaching conversation:



      Imagine that you are coaching someone, and you both agree that it is time to focus on generating possible solutions to the challenge at hand. So you ask your coachee: “How can you achieve this goal?”  Without any hesitation, you receive an answer. What do you do next?

      Let’s take a case in point. A few years ago I was coaching someone who wanted to get better at giving strategic presentations, especially to senior management. We had already explored what had gone well and less well in the past, conditions that have an impact on performance, the advantages to achieving the goal and the disadvantages of not achieving it. By this point, his motivation was solidly in place. With both of us keen to get to solutions, the conversation went something like this: 



      I asked, "What can you do to improve your presentation skills when presenting to senior management?" My coachee quickly replied, "I can take a course." 


      Tempted to explore possible courses, and whether there was a budget for a course available, etc. etc., I simply made note and asked, “What else?”  He quickly replied, “I can get a presentation coach.” 


      I thought about exploring the qualities of the ideal presentation coach, but didn’t.  Instead I inquired, “And what else can you do?” There was a slight pause, and then he answered, “Well, I could go on YouTube and check out the techniques of some of my favorite speakers. [pause] And TedTalks. Mmm. I’d like that.” 


      I noted once more, and then said, “What other things might you do?” There was a significant pause, during which he looked out the window. Then he said, “David. He is quite good. I’d love to have coffee with him and pick his brain. [pause] And I really need to watch him more consciously when he presents next time, and figure out what it is he is doing exactly that works so well.”


      “Mmmm.” I said, noting these new ideas. “And what else would work for you?” This pause was even longer, and I waited. Finally he said, “Well, a couple of my team members have attended some senior management meetings, and they’ve seen me in action.  I bet they would be happy to give me candid feedback and suggestions.”


      Tempted to ask who he might speak with and what questions he might ask, I just said “Ok. Anything else?” After a very, very long silence, he said “Well frankly, if I am really serious about this, I should practice my next presentation several times before I actually have to give it. [pause] I could even film myself. Yes! Yes! It would be so useful to observe myself in action! Then, when I finally like what I see, I will have the confidence to do a repeat performance when it really matters!”


      When he was out of ideas, we reviewed each option he had generated, and then moved eagerly on to action planning.

      While some of those post-question silences were pretty long, I don’t even think that my coachee noticed them. He was very busy creating. His first ideas were probably not new, because his answers came immediately after the question was posed. But because I kept asking the same question (with different words) over and over again, his mind kept creating, and the pauses between question and answer got longer and longer.

      My general guideline in these situations is “The longer the silence, the newer the idea.” There are two things to avoid once you have carefully crafted this creative moment:
      • Don’t grab one idea and analyze it in detail – leave that for later once all the ideas are on the table.

      • Keep in mind that the longer the silence after your question, the harder your coachee is probably thinking, and therefore creating. If your question is followed by silence, you are probably ‘on a roll’! This is the best confirmation that your question is a good one!
       Link to Julie Johnson's blog


      I like the narrative style.  It's practical and gives a good insight into the judgements that a coach continually has to make about the nature of their interventions.  The day after the blog was posted, I shared it with some managers I had been working with on coaching practices, and I received this response from one of the recipients shortly afterwards: 

      Thanks John.  I actually used this approach with one of my team – it worked brilliantly, and almost as set out above.

      ...nice feedback and evidence of the importance of sharing ideas and practices.

      Coaching practice - what else do we need to consider?


      Through my master's studies I looked in some detail at the field of conversation analysis and ethnomethodology and the structuring of sense-making that is part of everyday conversational interaction. If you are interested in following this in more detail, please go to my blog Observing Practice.

      Of particular interest in Julie's account is the description of the silences. What's noticeable is the work that's going on in the silences.  

      The 'what else?' question is an effective device to stimulate thought, and the skill of the coach is to hold the pause to allow the thinking work to develop - "There was a significant pause, during which he looked out the window".  However, what we can't tell from this account is what the coach was doing whilst the coachee was looking out of the window.  My expectation is that the coach was helping to maintain the silence by following good listening practices, like maintaining eye contact and avoiding non-verbal gestures or movements that might break the silence.  My point is that these practices are taken for granted, but they are as much a part of the ordering process that accomplishes an effective coaching intervention as the powerfulness of the question itself.


      Why is this important?


      Coaching is an important approach to helping facilitate change in individuals and teams.  However, like other management practices, it has a kind of mysterious 'black box' quality to it; coaching is not accomplished through a model or a set of questions or behaviours but through a choreography of fine-grained actions that emerge situationally each and every time a coach works with an individual or a team.  In other words, however good 'what else?' is as a question - and it's one that I often use too when I'm coaching - it rather glosses over a lot of important but unseen work that is also contributing to the result.  


      The challenge of capturing everyday action 


      My continuing interest in conversation analysis is the opportunity that it offers to study the choreography and use it as a learning tool to  enhance the development of coaches and coaching practice. 

      The challenge is that this choreography slips by too quickly and is too nuanced; and to be of use the action would need to be captured on video or audio and then analysed to produce a micro level detail of practice.  I know from personal experience that the analysis takes a great deal of time and, at present, is too onerous to be of practical use.  

      However, I am hopeful.  Big data technologies are now emerging that are able to provide information about, inter alia, workplace interactions - for an example see the HBR blog The new science of building great teams and the use of electronic badges to gather interactional data.  This type of analysis looks very promising and is producing some ground-breaking insight into how people interact.  It forms part of a body of work coming out of MIT that is being described as social physics.  I'm reading up on this at the moment and will write a further post on my sense-making on this topic.




                Oracle Big Data Cloud Service CE: Working with Hive, Spark and Zeppelin 0.7        
      In my previous post, I mentioned that Oracle Big Data Cloud Service – Compute Edition now ships with Zeppelin 0.7, and that version 0.7 does not include the Hive interpreter. This means we can no longer use "%hive" blocks to run queries against Apache Hive. Instead of "%hive" blocks, we can use the JDBC interpreter […]
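
      As a rough illustration of the workaround (a minimal sketch of my own, not code from the post: it assumes a %pyspark paragraph on a cluster where Spark is wired to the Hive metastore, and the table name is hypothetical), Hive tables can also be reached through Spark SQL:

      # Minimal sketch: query a Hive table via Spark SQL instead of a %hive block.
      from pyspark.sql import SparkSession

      # In a Zeppelin %pyspark paragraph a Hive-enabled session is usually
      # already provided; building one explicitly makes the sketch self-contained.
      spark = (SparkSession.builder
               .appName("hive-via-spark-sql")
               .enableHiveSupport()
               .getOrCreate())

      df = spark.sql("SELECT category, COUNT(*) AS n FROM sales GROUP BY category")
      df.show()
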
                Oracle BDCSCE Upgraded: Zeppelin 0.7 and Spark 2.1        
      Last week, Oracle Big Data Cloud Service – Compute Edition was upgraded from 17.2.5 to 17.3.1-20. I do not know if the new version is still in the testing phase and available only to trial users, but sooner or later it will be available to all Oracle Cloud users. The new version is still […]
                Harvard, Stanford, UC Berkeley Take Center Stage at Upcoming HIMSS Big Data & Healthcare Analytics Forum May 15-16        

      HIMSS Gathers Top Data Scientists and Healthcare Pros To Talk Putting Data To Work to Reduce Costs and Improve Patient Care

      (PRWeb May 02, 2017)

      Read the full story at http://www.prweb.com/releases/2017/05/prweb14294220.htm


                UC Berkeley’s Stefano M. Bertozzi to Keynote HIMSS Big Data & Healthcare Analytics Forum in May        

      HIMSS Unveils Full Speaker Lineup Featuring Healthcare Data Experts from Advocate Health Care, Harvard Medical School, Kaiser Permanente, Sutter Health, and Stanford University School of Medicine

      (PRWeb March 31, 2017)

      Read the full story at http://www.prweb.com/releases/2017/04/prweb14200579.htm


                HIMSS Announces Speaker Lineup of National Experts for its Big Data & Healthcare Analytics Forum in June        

      Discover How Innovators at Advocate Health Care, Geisinger, Mayo Clinic, Penn Medicine, and UCSF are Putting Data to Work to Improve Patient Care, Safety, and Outcomes

      (PRWeb April 22, 2016)

      Read the full story at http://www.prweb.com/releases/2016/04/prweb13362072.htm


                HIMSS' Big Data and Healthcare Analytics Forum: A Deep Dive at the Right Time        

      November Event to Showcase Vital New Analytics Opportunities for Healthcare IT

      (PRWeb October 08, 2015)

      Read the full story at http://www.prweb.com/releases/2015/10/prweb13011864.htm


                The human insights missing from big data | Tricia Wang        
      Why do so many companies make bad decisions, even with access to unprecedented amounts of data? With stories from Nokia to Netflix to the oracles of ancient Greece, Tricia Wang demystifies big data and identifies its pitfalls, suggesting that we focus instead on "thick data" -- precious, unquantifiable insights from actual people -- to make the right business decisions and thrive in the unknown.
                Openbravo partners with Happiest Minds Technologies to provide cutting edge OmniChannel Solutions to Retailers Worldwide        

      Barcelona, Spain & Bengaluru, India, August 27th 2015. Openbravo, the provider of the preferred Commerce Suite and Business Suite, today announced a strategic partnership with Happiest Minds Technologies, a next-generation Digital Transformation, Infrastructure & Security and Product Engineering Services company.

      "Openbravo welcomes the strategic partnership with Happiest Minds, and both organizations are engaging in a great working relationship to bring cutting-edge technological advancements to Openbravo's retail offerings. Our global presence, plus the importance and international reach of the Openbravo Commerce Suite clients in the retail space, makes us uniquely positioned to work together with a partner of the stature of Happiest Minds. We believe that Retail Customers, several of which are true market leaders, can benefit greatly from this alliance", said Marco de Vries, CEO, Openbravo.

      "We were looking for the right partner to build a unique proposition for retailers to help them rapidly improve their customer buying experience within an OmniChannel framework. With Openbravo, we found the right solution at a competitive price and the flexibility to extend it to provide a differentiated offering to our Retail Customers", said Salil Godika, Chief Strategy & Marketing Officer and Retail Industry Group Head of Happiest Minds.

      "Happiest Minds and Openbravo are focused on the OmniChannel framework, where the idea is to enhance the overall performance of a retailer and help its Retailer Customers bring the consumer back into their stores by providing a unique experience. We are working together in areas such as OmniChannel Fulfillment, Retail IoT and Advanced Analytics", said Sunando Banerjee, Channel Business Manager at Openbravo.

      The Openbravo Commerce Suite is a multichannel retail business solution built on top of a truly modular, mobile-enabled and cloud-ready technology platform. The platform allows retailers to transform their physical store channel and do more, faster, with lower risks.

      About Openbravo: Openbravo is a world leader in the commercial open source software space, helping midsize to large organizations in 60+ countries around the globe successfully manage continuous change and innovation by providing business management solutions that deliver a high degree of agility, responsiveness and usability, including a state-of-the-art multichannel retail solution, the Openbravo Commerce Suite, and a global management solution, the Openbravo Business Suite, both built on top of a highly flexible and extendible platform that allows companies a greater focus on differentiation and innovation. Openbravo solutions are exclusively distributed through a network of Official Openbravo Partners. Openbravo has offices in India, France, Mexico and Spain.

      About Happiest Minds Technologies: Happiest Minds enables Digital Transformation for enterprises and technology providers by delivering seamless customer experience, business efficiency and actionable insights through an integrated set of disruptive technologies: Big Data analytics, Internet of Things, mobility, cloud, security, unified communications, etc. Happiest Minds offers domain-centric solutions applying skills, IPs and functional expertise in IT Services, Product Engineering, Infrastructure Management and Security. These services have applicability across industry sectors such as retail, consumer packaged goods, e-commerce, banking, insurance, hi-tech, engineering R&D, manufacturing, automotive and travel/transportation/hospitality. Headquartered in Bangalore, India, Happiest Minds has operations in the US, UK, Singapore and Australia, and has secured $52.5 million in Series A funding. Its investors are JPMorgan Private Equity Group, Intel Capital and Ashok Soota.

      more info


                Big Data Study Reveals Possible Subtypes of Type 2 Diabetes        
      NIH Director’s Blog, 10 November 2015. In recent years, there’s been a lot of talk about how “Big Data” stands to revolutionize biomedical research. Indeed, we’ve already gained many new insights into health and disease thanks to the power of new technologies to generate astonishing amounts of molecular data—DNA sequences, epigenetic marks, and metabolic signatures, to name […]
                What are the characteristics of “true clusters?”        

      In the digital world, where billions of customers are making trillions of visits in a multi-channel marketing environment, big data has drawn researchers’ attention all over the world. Customers leave behind a huge trail of data in digital channels. It is becoming an extremely difficult task finding the right [...]
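
      As a side note on how analysts often operationalize "true clusters": internal validation indices reward groupings that are tight and well separated. Here is a minimal sketch (the synthetic data and parameters are my own illustrative assumptions, not from the SAS post) using silhouette scores:

      # Minimal sketch: silhouette scores peak when the chosen k matches the
      # real group structure, one common signal of "true" clusters.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      rng = np.random.default_rng(1)
      # Three well-separated synthetic "customer" groups in two dimensions.
      X = np.vstack([rng.normal(loc=c, scale=0.3, size=(100, 2))
                     for c in (0.0, 3.0, 6.0)])

      for k in (2, 3, 5):
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
          # Near 1: tight, well-separated clusters; near 0: overlapping clusters.
          print(f"k={k}: silhouette = {silhouette_score(X, labels):.2f}")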

      What are the characteristics of “true clusters?” was published on SAS Users.


                Rated 5 - Nature Extra: Big Data        
      Big Data: As Google celebrates its 10th anniversary, we find out how science is coping with massive datasets generated by unprecedented computing power. BoingBoing blogger Cory Doctorow tells us about his visits to the LHC data storage facility and the genome sequencing Sanger Centre.

      Posted on Sep 3, 2008 at 18:00 in Nature Podcast [ Subscribe ]


                Pivotal CEO Paul Maritz on the goal of Big Data…        
      CloudOrigin: Pivotal CEO @Paul_Maritz: the goal of #BigData is to catch customers in the act – speaking at …
                Gary Marcus on the Future of Artificial Intelligence and the Brain        
      Gary Marcus of New York University talks with EconTalk host Russ Roberts about the future of artificial intelligence (AI). While Marcus is concerned about how advances in AI might hurt human flourishing, he argues that truly transformative smart machines are still a long way away and that to date, the exponential improvements in technology have been in hardware, not software. Marcus proposes ways to raise standards in programming to reduce mistakes that would have catastrophic effects if advanced AI does come to fruition. The two also discuss "big data's" emphasis on correlations, and how that leaves much to be desired.
                BSG Financial Group to Host Webinar About Harnessing Better Data, Not ‘Big Data’ for Performance Improvement        

      Session Will Help Financial Institutions Identify and Apply Data That Is Currently Available to Them to Compete More Effectively

      (PRWeb March 21, 2017)

      Read the full story at http://www.prweb.com/releases/2017/03/prweb14170253.htm


                August 8, 2017 – “How Big Data & Advanced Analytics Can Improve Population Health; Now and in the Near Future”        
      William Kassler, MD, MPH, Deputy Chief Health Officer and Lead Population Health Officer, IBM Watson Health. TUESDAY, AUGUST 8, 2017, 12:00-1:00 PM (LUNCH PROVIDED FOR FIRST 50 GUESTS), MAURER CENTER FOR PUBLIC HEALTH, ROOM 3013. We are poised at the start of a new era of technology and big data. This presentation will review sophisticated efforts that build on the increased availability of big data and improved analytics, and will describe how these efforts support the integration of personal, practice, system, and community programs to improve individual and population health. Objectives:... Continue Reading
                SQL Server 2017 Integration Services Cookbook        

      I coauthored my 15th book! Together with Christian Cote (lead author) and Matija Lah (coauthor), we published SQL Server 2017 Integration Services Cookbook. Of course, it is a little early to call this a definitive guide to SSIS 2017; a more accurate name would be SSIS 2016/2017 Cookbook. Besides detailed guidelines on how to use the 2016 version, you will also find a chapter with new information on scaling out SSIS 2017. In the future, we will add an online chapter, if needed, about additional new SSIS 2017 functionality. Anyway, here is a brief description of the chapters.

      Chapter 1: SSIS Setup

      This chapter describes, step by step, how to set up SQL Server 2016 to get the features used in the book.

      Chapter 2: What is New in SSIS 2016

      This chapter is an overview of the new features in Integration Services 2016. Some of the topics mentioned here are covered extensively later in the book.

      Chapter 3: Key Components of a Modern ETL Solution

      This chapter explains how ETL has evolved over the past few years and what components are necessary for a modern, scalable ETL solution that fits the modern data warehouse.

      Chapter 4: Data Warehouse Loading Techniques

      This chapter describes many patterns used when it comes to data warehouse (DW) or operational data store (ODS) load.

      Chapter 5: Dealing with Data Quality

      This chapter will describe how SSIS, DQS and MDS can be leveraged to validate, cleanse, maintain, and load data.

      Chapter 6: SSIS Performance and Scalability

      This chapter talks about how to monitor SSIS package execution. It also provides solutions to scale out processes by using parallelism. Readers learn how to identify bottlenecks and how to resolve them using various techniques.

      Chapter 7: Unleash the Power of SSIS Script Task and Component

      Readers learn how script tasks and script components are very valuable in many situations to overcome the limitations of stock toolbox tasks and transforms.

      Chapter 8: SSIS and Advanced Analytics

      This chapter talks about using SSIS to prepare data for and do advanced analyses like data mining, machine learning, and text mining. Readers learn how sampling components can be used for preparing the training and the test set, how to use SQL Server Analysis Services data mining models, how to execute R code inside SSIS, and how to analyze texts with SSIS.

      Chapter 9: On-Premises and Azure Big Data Integration

      This chapter talks about the Azure Feature pack that allows SSIS to integrate Azure data from blob storage and HDInsight clusters. Readers learn how to use Azure feature pack components to add flexibility to their SSIS solution architecture.

      Chapter 10: Extending SSIS Custom Task and Transformations

This chapter talks about extending and customizing the toolbox using custom-developed tasks and transforms.

      Chapter 11: Scale Out with SSIS 2017

      The last chapter is dedicated to SSIS 2017 and teaches you how to scale out SSIS package executions on multiple servers.

      Enjoy the reading!


                Biology: Assistant Professor of Biology - University of Richmond - Richmond, VA        
      We seek a biologist who has expertise in analysis of big data, modeling, bioinformatics, genomics/transcriptomics, biostatistics, or other quantitative and/or...
      From University of Richmond - Thu, 06 Jul 2017 23:17:18 GMT - View all Richmond, VA jobs
                Narro Reading of 5 Ways Big Data Is Transforming Finance        
      Listen on Narro As more companies embrace a Big Data approach, new trends and changes are in motion at this very moment and transforming finance
                Zaloni Named to Q3 2017 Constellation ShortList™ for Data Lake...        

      Zaloni continues to lead in big data management and data governance

      (PRWeb August 09, 2017)

      Read the full story at http://www.prweb.com/releases/2017/08/prweb14585392.htm


                MCH Strategic Data Connects Big Data, Email Marketing and Target...        

      MCH’s proprietary email scoring solution, eRespond, uses real-time data to rank and score K-12 educators in institutions most likely to respond to email offers. This gives marketing professionals the...

      (PRWeb August 07, 2017)

      Read the full story at http://www.prweb.com/releases/2017/08/prweb14560383.htm


                What is the Power of Cloud? (to the Business)        

Of course many readers, technologists and businesses are acutely aware of the value proposition of the cloud, but what about the rest of the "majority" of people who aren't? If you do a search on this topic, you will mostly get links to marketing materials that are fairly high level and full of the usual buzzwords, with more links to lengthy case studies. As a business technology enabler, there needs to be a simpler message or elevator pitch that is easy to understand yet convincing. Ah, the simplicity or less-is-more approach rears its beautiful head once again. Below are some recipes to draw from for an elevator ride of any duration.

The Enterprise Cloud Reduces Business Outages, Increasing Customer Satisfaction

Put simply, and avoiding technical acronyms: can the business rely on 99.99+% availability from the cloud? Look no further than Netflix as an example, because yes, the cloud is certainly up to the task.
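To make that number concrete, here is a quick back-of-the-envelope calculation (my own illustration, not part of the original pitch) translating availability percentages into the downtime they allow per year:

# Downtime budget implied by an availability target.
MINUTES_PER_YEAR = 365 * 24 * 60

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.3%} uptime allows ~{downtime:,.1f} min of downtime/year")

# 99.99% uptime works out to roughly 52.6 minutes of downtime per year,
# a number a business audience can weigh directly against lost revenue.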

Immediate Availability Gives You Your Competitive Edge (for Now)

Back in the day, Dell drove industry change by assembling computers on demand, reducing inventories and wreaking havoc on computer storefronts. The cloud equivalent is to be able to go to the cloud store, answer a few questions, enter your credit info, and within an hour or less you are up and running.

A Right-Sized Business Reduces Time, Headaches and Cost

You may wonder what this is. It takes many forms, but to me you have to ask: does the business get what it paid for? No more shelfware and complicated upgrade processes. You need more capacity for certain business cycles? Zoom, it's there. You need more storage? Of course that is automated. You think Apple and other retailers have to turn on any switches to increase capacity? Maybe they do now, but they shouldn't.

Business-Friendly User Interfaces Decrease Time to Market

You're thinking mobile, and that is certainly true, but it is much more than that. This is the type of disruptive technology that could put technologists like me out of work. Non-technical end users must be able to build and deploy apps that previously required a CS degree and many years of experience. Consider how Smart Data cloud initiatives are disrupting Big Data. Nate Silver, beware.

Build a Nimble Business by Thinking Small

The cloud is all about modularity, extensibility and continuous release cycles. A/B testing drives micro feature planning, rendering the traditional roadmap virtually useless. In a fully optimized environment, features will show up before you even have a chance to request them. Social tools will drive this change more than traditional communication channels, and you have to be on top of it because your competition sure is.

Focus on Business, but Engage Developers and IT to Modernize

Remember, we are all in this new, delicate cloud ecosystem together, so it is best to engage all parties even as you try to do an end around them. Developers will be crucial allies in building a hybrid on-prem cloud solution, and IT will help you track outages on cloud systems just like they always have for internal systems.

Be Business Tech Savvy to Extend Your Brand

You may be asking yourself, how does this extend the brand? Think huge retailer with leading cloud PaaS market share. Make a point of understanding these new disruptive forces of nature. Understand how NodeJS and NPM drive modularity and manage dependencies that reduce business risk. Utilize GitHub to research and rate technologies that could impact and shorten your time to market. Don't be passive: get a free developer account on Koding.com to experience a PaaS system and write a Hello app in 5 different languages. Use the force, do a lot of "What is XYZ" searches.

      Summary

      Transform your business but don’t forget to have some fun.


                Fighting for school libraries, public libraries, and using big data to do it        
      I spoke with Dustin Fife of Utah Library fame for his podcast and we discussed what we can do to fight for libraries.
Minimally Invasive Surgeries That Save Lives        
Surgical operations performed with the least possible impact on the body are changing medicine forever. Precision robots, big data, holographic previsualization and virtual reality are some of the key technological paradigms behind these interventions, which drastically reduce risks, pain and recovery times for patients.
                CyberCalifornia Launch        


      As we have seen in recent news headlines, security breaches can bring entire organizations, states and countries to their knees. In today's connected world, making security a top priority is no longer a choice - it's a must. As public and private organizations continue to operate within this new era of the Internet, security will become critical to maintaining trust with the public, building company reputation, as well as safeguarding data, IP and critical infrastructure.

      California is at the center of the digital revolution that is shaping the world around us. Already a national center of commercial cybersecurity activities, California is home to companies building the cybersecurity products and solutions that are securing commercial businesses, academic institutions and governmental organizations across the globe.

In an effort to help advance the goals and promote the accomplishments of the Governor's Cybersecurity Task Force, CyberTECH, among other state and local leaders, recently launched CyberCalifornia.



      CyberCalifornia will organize public-private partnerships in cybersecurity, with the goals of facilitating research and innovation in cybersecurity, educating California businesses about cybersecurity needs and resources, and connecting California's robust workforce development system with the needs of California employers.

      Led by its Board of Advisors, CyberCalifornia activities include:
      Assisting in the organization of private sector advisory groups by vertical industry such as banking and finance, high technology, agriculture, etc.
      Assisting in the development and promotion of cybersecurity career pathways
      Partnering with local and regional economic development organizations to inform California's small business community about cybersecurity needs and solutions
      Establishing connections between the cybersecurity and Internet of Things sectors through activities such as conferences and media events

      To learn more about CyberCalifornia, please contact darin@cyberhivesandiego.org.



      Darin Andersen, CEO, CyberUnited, Co-Chair, CyberTECH, Co-Chair, Economic Development Subcommittee, California Cybersecurity Task Force

      "CyberCalifornia: Cybersecurity and IoT Gold Rush"

      Recently, CyberTECH helped launch CyberCalifornia with other State and local leaders. The initiative is organized in conjunction with the Innovative Hub (iHub) Network, a program administered by the State Office of Economic Development and in partnership with Governor Brown's Cybersecurity Task Force.


      Jerry Brown, Governor of California

      CyberCalifornia will organize public-private partnerships in cybersecurity to better protect California's critical infrastructure, businesses and citizens from cyber threats, facilitate research and innovation in cybersecurity, educate California businesses about cybersecurity needs and resources, and connect California's robust workforce development system with the needs of California employers.

      Center of Cybersecurity and Internet of Things Excellence (CCIoTE)
      California is home to the personal computer, the firewall, anti-virus and many other cybersecurity products. Today, California companies are at the forefront of new technologies ushering in the Internet of Things (IoT), the term for the phenomenon where people and things are connected to the Internet, leveraging sensors and real time analytics and cloud technologies.

      California's leadership role in advanced technology sectors including autonomous vehicles, biotechnology, precision in medicine and advanced manufacturing, will contribute to the State's continued excellence in cybersecurity and privacy. The powerful combination of cyber and the emergence of these innovative intensive sectors make California the perfect place to build secure next generation technologies.

      California has a rapidly growing information technology industry cluster and offers the full spectrum of cybersecurity capabilities. Our Golden State has tremendous assets to keep our Country safe, advance innovation with security and privacy built in and be a beacon for other States in our Nation to follow.


      Charles "Chuck" Brooks, Vice President, Government Relations and Marketing, Sutherland Global Services

      "Adopting a Cooperative Global Cyber Security Framework to Mitigate Cyber Threats (Before it is too Late)"
The recent breach at the U.S. Government's Office of Personnel Management (OPM) provided a wake-up call to the seriousness and sophistication of the cyber security threat aimed at both the public and private sectors. The fact is that over 43% of companies had breaches last year (including mega companies such as Home Depot, JPMorgan, and Target). Moreover, the intrusion threats are not diminishing. For example, British Petroleum (BP) faces 50,000 attempts at cyber intrusion every day.

      According to the think tank Center for Strategic and International Studies (CSIS), cyber related crime now costs the global economy about $445 billion every year. These cyber security breaches demonstrate that there is a continued need for protocols and enhanced collaboration between government and industry.

In 2014, vulnerabilities in widely used open source code, such as Heartbleed, Shellshock, Wirelurker and POODLE, caused chaos and harm. The cyber security community responded to those vulnerabilities with "react and patch." Unfortunately, this means of response has been, for the most part, a cosmetic or band-aid approach.

The cyber security community's posture must change from one of wait-and-react to one of being proactive and holistic. It is not really a question of which policies, processes and technologies are ready and best; that will always be debatable. Being proactive means adopting a working Industry and Government Global Cyber Security Framework that would include measures for encryption, authentication, biometrics, analytics, automated network security, and a whole host of other topics related to cyber threats.

      CONTINUE READING



      LIFARS, Featured CyberTECH Member

      LIFARS is a digital forensics and cybersecurity intelligence firm based in New York City. With its history of investigating cybersecurity breaches across a number of industries, LIFARS is uniquely positioned to help increase cybersecurity posture to protect organizations and individuals from real-life hackers and advanced persistent threat actors. By bringing in LIFARS, you can maximize your existing investment into the cybersecurity infrastructure and make sure that your future investments are strategically placed – delivering maximum protection while preserving the productivity of your employees. For these and other reasons, LIFARS was recently ranked as the #2 cybersecurity company in New York Metro area on the Cybersecurity 500 list.

      LIFARS WEBSITE

      In addition to providing robust security solutions based on best practices and personal hands-on experiences, LIFARS continuously explores the latest innovations in the cybersecurity field and always seeks to find what is shaping tomorrow's industry landscape. In a recent interview with Founder and CTO of LIFARS, Ondrej Krehel, and LIFARS' Digital Forensic Examiner, Paul Kubler, they discussed strategies and policies for cybersecurity in the world today, including common mistakes and how to make them right.

      LIFARS INTERVIEW




NXT Robotics, Featured CyberTECH Member
NXT Robotics is a San Diego-based company that designs and builds service robots to support the increasing needs of the hospitality industry. NXT Robotics' service robot platforms are able to provide delivery, security and guest-related services to customers - all while maintaining a consistent and high degree of quality.

The company's founder, Jeff Debrosse, has over 20 years of software engineering, cybersecurity R&D and enterprise product management and deployment experience. "This is an exciting time for NXT Robotics," said Jeff. "With access to the CyberTECH community and its resources, our success is further guaranteed."

      The company will be providing CyberTECH's incubator and shared workspace offices, CyberHive and iHive, with its own Nixie. "Your team, tenants and guests will find Nixie to be amazingly pleasant to deal with - not to mention, very useful!" Jeff stated.

      "NXT Robotics understands the importance of making cyber part of the foundation. We are thrilled to have NXT Robotics join CyberTECH as a member and look forward to working closely with Jeff and his team." said CyberTECH Co-Chair and Founder, Darin Andersen.

      We are proud to recognize NXT Robotics as a featured CyberTECH Member for July 2015.



      NXT ROBOTICS WEBSITE




      Bird Rock Systems, Featured CyberTECH Member
      Bird Rock Systems is a company that has been built on a foundation of exceptional customer service, technology and long-term partnership. Bird Rock Systems excels at deploying the latest enterprise class technologies including: security, routing, switching, traffic management, WAN acceleration, wireless, IP communications, storage area networking, performance computing and virtualization. Bird Rock typically begins a new client engagement by completing a network or security assessment. Their many loyal customers represent enterprise business, casino, university, Fortune 500 and government organizations requiring 'Best in Class' secure technical solutions.

      BIRD ROCK SYSTEMS WEBSITE



      iWebGate, Featured CyberTECH Member
      Founded in Australia in 2004 with global corporate operations in North America, iWebGate has pioneered a new form of virtualization technology - the Virtualization of Network Services. iWebGate's LaunchPad allows organizations to properly and securely segment networks, connectivity and devices, eliminating the need for Firewalls and VPNs as primary security and connectivity solutions. By deploying the iWebGate Workspace Suite, organizations can then integrate security and business applications into the iWebGate LaunchPad transforming them from "enterprise friendly" products into "enterprise ready" solutions. The result is faster, more secure and reliable access to networks and network services.

      IWEBGATE WEBSITE



      San Diego Venture Group Selects Fhoosh as a 2015 Cool Company
      Cybersecurity software development firm FHOOSH, Inc. has been chosen as a "Cool Company" by the San Diego Venture Group (SDVG) for a second year. One of 31 Cool Companies selected this year from over 160 applicants, FHOOSH continues to represent the leading edge of San Diego-area tech innovation.

FHOOSH helps corporations, institutions and government organizations protect and power valuable stored digital information with its cybersecurity platform and productivity software. FHOOSH bankLevel+ cybersecurity safeguards an organization's critical business and customer data from cyber threats by storing it in a state that is useless to hackers. It does this approximately five times faster than storing data unencrypted, with technology that breaks apart, disassociates, separately encrypts, and then disperses the data. The system also quickly notifies network administrators when unauthorized individuals try to access FHOOSH-protected databases, object stores and file systems. FHOOSH integrates with existing infrastructure and allows corporate partners to dial in the security, big data/analytics and performance they need. With 15 patents pending, FHOOSH technology has been validated by the foremost cybersecurity response and assessment firm.

      CONTINUE READING


      Maggey Felix, Featured CyberTECH Advisor

      Maggey Felix specializes in Marketing and Operations with 5+ years of experience in the technology and cybersecurity industry. Her passion for cybersecurity and cutting-edge technologies is shown through her dedication to helping companies better prepare, organize and market their solutions.

      Over the past two years, Maggey has worked closely with the CyberTECH organization to support various marketing and operational activities. Her ongoing effort and commitment to CyberTECH makes Maggey an invaluable member of the community.

      We are proud to recognize Maggey Felix as the featured CyberTECH Advisor for July 2015.


      Julia Scholl, CyberTECH Director of Marketing and Operations

      Julia Scholl is a strategic and forward-thinking public relations and marketing professional with over five years of experience working with non-profit and startup organizations. A capable self-starter with excellent organizational and communication skills, Julia is passionate about building and fostering lasting relationships within the CyberTECH community.

      As Marketing and Operations Director, Julia will assist with daily operations, provide membership support as well as ongoing support with events, programs, and all other CyberTECH initiatives.

      We are excited to welcome Julia to CyberTECH. Please feel free to contact Julia directly at julia@cyberhivesandiego.org.



      Jessica Herrmann, CyberTECH Events Coordinator

      Jessica Herrmann has over 20 years of experience applying key leadership, communication and problem solving skills within the hospitality industry. As Catering and Events Manager, Jessica has worked with a number of organizations to develop, manage and execute top quality events.

      Jessica recently joined CyberTECH as Events Coordinator where she will help with the planning, organization, preparation and execution of CyberTECH events.

      Please join us in formally welcoming Jessica to the CyberTECH community.

      Upcoming Events
      Internet of Things (IoT) Meetup - September 17, 2015

      SAM Fest (Startups + Art + Music) - September 23-24, 2015

      IoT Startup Table Breakfast - October 13, 2015









Step Inside Webpass!
CyberTECH 2015 Newsletter Sponsor

Webpass, a leading Internet service provider in the San Diego area, is now delivering residential Internet connections at 100, 200, or 500 Mbps and business Internet connections from 10 to 1,000 Mbps. There has never been a better time to cut the cable and switch to Webpass for your Internet needs! As the owner and operator of its Ethernet network, Webpass promises customers a simple urban Internet experience. The company distinguishes itself from the competition through the simplicity of its setup, absence of contracts and personable customer service.

Check out their new brand video and you'll see why Webpass isn't just another ISP. They're leaders and innovators on a mission to change the way people think about the Internet. You can contact them at 1-800-Webpass!

Sign up today and instantly browse the Internet without modems, contracts or gimmicks.

      Webpass
      1360 5th Avenue
      San Diego, CA 92101
      1-800-Webpass 




      Get Involved!

A key CyberTECH operating principle is collaboration. We are always looking to partner with individuals and organizations that want to get involved in various cyber and IoT initiatives throughout the region and across the globe. Opportunities include event chair, volunteer, champion, program chair and more. For additional information on how you can support CyberTECH, please contact Julia Scholl.



      Join the CyberTECH Meetup Groups
      CyberTECH Cybersecurity Meetup

      CyberTECH Internet of Things Meetup




                 Data Lake, Big Data, NoSQL - The Good, The Bad and The Ugly         
      ...
Will the Next Big Tech Bubble Be in Trading?        
Trading technologies are on shaky ground and could become the next big tech bubble. Big data is the key to success in trading technology. Technology will make global markets accessible to the general public.
                Cleaning Big Data the Easy Way        

      Cleaning big data usually invokes big stress levels. With the advancements we’ve seen in technology over the years, many industries have been transformed from the very core. For academic researchers and data scientists, data has become more expansive and detailed than ever before. However, this has also affected business analysts and millions of others in […]

      The post Cleaning Big Data the Easy Way appeared first on Dataladder.


                A Few Facts on the Cost of Poor Data Quality        

In the ever-growing world of big data, the benefits of implementing a strong data quality and data governance program become more apparent. Poor data quality can cost a company major financial loss as well as reputational damage. In numerous surveys, IT directors and program managers agree that poor data quality is a major obstacle […]

      The post A Few Facts on the Cost of Poor Data Quality appeared first on Dataladder.


                Mastering Scribe Online – New Webinar Series        

      Technology products can change fast and Scribe’s integration platform as a service (iPaaS) is no exception.  Like most cloud services, Scribe’s iPaaS is upgraded on a continual basis with several releases each year.  Just over the past few months we have introduced an entirely new user experience, added support for new connectors (e.g., big data connectors for Amazon S3 and Amazon...

      The post Mastering Scribe Online – New Webinar Series appeared first on Blog :: Scribe Software.


                How to Benefit from Upgrading Your Digital Mindset        
      More technologies are simultaneously reaching maturity than at any other time in recent memory. Getting the most out of cloud, mobile, big data, IoT, machine learning, artificial intelligence and other maturing technologies will require organizations to open themselves to new ways of thinking.
                The Company Behind WordPress Is Shutting Its Office Because Too Many Employees Work Remotely        

      Automattic, the company behind WordPress, will be closing its San Francisco office—apparently because very few employees have been choosing to show up for work in person, Quartz reports.

      Automattic has long given its blessing to working remotely. Being in the office was always optional, and the company even provides financial support for employees to work from other locations. It offers employees $250 a month to use co-working offices, and if an employee works from a Starbucks, Automattic offers to pay for his or her drink. Now so few people are coming into the office that keeping it open just doesn’t seem to be worth the cost anymore.

      Tech companies take a wide variety of approaches to employees working remotely. In 2013, Yahoo CEO Marissa Mayer famously banned working from home. Mayer, who received a lot of criticism for the decision, said she didn’t want employees working offsite because people in the office were complaining that they rarely interacted with their remote counterparts. She also felt that people could be more collaborative if they were in the same place and cited the development of a weather app in conjunction with Flickr as an example of the benefits of working together in person.

      Similarly, in March, IBM, which was an early supporter of working from home, told employees from its U.S. marketing team who work remotely they would have to make it in person to one of six different locations or find a new job. Other departments, including security and procurement, had previously been told to work from offices. This is a surprising change for a company that embraced remote work in the ’80s and ’90s. It even announced in 2009 that having so many remote employees—40 percent of the company, to be exact—saved it about $100 million annually on office space in the U.S.

      Experts are a bit divided on the real-world results of remote working. In 2011, researchers from Harvard Medical School looked into the “water cooler effect”—the idea that employees can increase productivity by speaking to one another in an informal setting during their breaks.

      The study looked at 35,000 biomedical science papers involving 200,000 authors. They found that on average, papers were cited more when the first and last authors were in greater in-person contact. Papers with four or fewer authors in the same building were also cited more than papers whose writers were in separate locations.

      On the other hand, Nicholas Bloom, a professor of economics at Stanford, acknowledges that working from home has a negative reputation, but believes that research should disprove that. Bloom himself conducted research in 2014 on working remotely by studying workers from the call center of Ctrip, a Chinese travel website. For the study, workers were allowed to volunteer to work from home for nine months. Bloom found that the group that stayed home was 13 percent more productive, and quit rates among them were cut in half. The frequency with which people quit a job is important because the average firm has a 50 percent turnover rate and companies waste time and money trying to hire new people every year, Bloom said in an interview. “The key thing to note is anecdotes are great, but big data shows working from home is rising.”

      He’s right. According to a 2017 Gallup poll, 20 percent of U.S. employees work remotely 100 percent of the time. Tiny Pulse conducted a survey of 509 US employees who are permanent remote workers and found that they tend to be happier at work and feel more valued compared to the overall group of workers, though they report having “a lower relationship with their co-workers,” as Tiny Pulse puts it.


                Reading temporary table from another session        
It happens to me at least once a week: I want to check the progress of some heavy script that runs in chunks over a big dataset, only to find out that it writes intermediate data to a temporary table. Last time it happened 3 days ago when I wanted to analyze...(read more)
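The post's own technique sits behind the "read more" link, but one metadata-level workaround worth knowing: you cannot select from another session's local temp table directly, yet you can watch its approximate row count grow through tempdb's catalog views. A minimal sketch in Python with pyodbc (the connection string is a placeholder):

import time
import pyodbc

# Placeholder connection string -- adjust driver, server and authentication.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=tempdb;Trusted_Connection=yes;"
)

# Approximate row counts for every session's local temp tables, taken from
# tempdb metadata; the tables' contents stay private to their owning sessions.
QUERY = """
SELECT o.name AS temp_table, SUM(p.rows) AS approx_rows
FROM tempdb.sys.objects AS o
JOIN tempdb.sys.partitions AS p ON p.object_id = o.object_id
WHERE o.name LIKE '#%' AND p.index_id IN (0, 1)  -- heap or clustered index
GROUP BY o.name
ORDER BY approx_rows DESC;
"""

while True:  # poll so you can watch the heavy script make progress
    for name, rows in conn.execute(QUERY):
        print(f"{name}: ~{rows} rows")
    time.sleep(30)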
                Katherine Glasheen is designing drones to think differently.        
Katherine Glasheen has a nickname fit for an engineer, "Machine," and it is not just because it rhymes with her last name.

A second-year aerospace PhD student, she has a drive to advance technology and is conducting research on socially aware drones, a project that will become increasingly important with wider adoption of UAVs. Today, however, it is future-focused enough that even her advisor calls it "kind of wild."

      “The technology is developing faster than society can handle," Glasheen says. "One drone delivering a package in downtown Denver is challenging enough, but what about when there are hundreds of them? We need systems that are scalable and robust."

      Her ideas have already earned a major stamp of approval. She was recently awarded a prestigious National Science Foundation Graduate Research Fellowship.

      Kind of Wild

      Her proposal calls for using internet data to infer local attitudes about drones.

      "If the UAV could analyze news articles and comments on websites and knew people in the area it was traveling were uncomfortable with drones, it could deliberately avoid flying over places like schools, hospitals, and parks,” Glasheen says.

      It was an intriguing possibility for her advisor, Associate Professor Eric Frew.

      "The idea is to create an 'ethical drone' that understands and tries to respect local attitudes. It is a novel way to think about combining unmanned systems, big data, and artificial intelligence. I've not heard anyone suggest it before," Frew says.

      Less Artificial, More Intelligence

      To get there, Glasheen is first working on improving more conventional trip planning methods.

      "For a delivery drone, the path it's following is preplanned and loaded before it takes off. That doesn't account for any variables it could encounter on the way where it would need to change course," Glasheen says.

      What kind of variables? Think of the things you encounter driving to the grocery store. As humans, we can quickly react if a driver runs a red light. If we encounter traffic, we can take a different route.

      While some drones have rudimentary obstacle avoidance systems, a quick YouTube search will show they are less than ideal in execution. In addition, obstacle avoidance cannot account for things like encroaching stormy weather.

      Glasheen wants the drone to be able to change course and make adjustments midflight, but the AI is not the only problem. The kind of computer needed to process that much data is large and heavy, and would quickly turn a flying drone into a grounded paperweight.

      Drones in the Cloud

That is where the cloud comes in. She sees a future where UAVs can regularly contact cloud systems to relay problems and determine solutions.

      "The drone has a small brain, but there's a big brain in the cloud. If the drone could ping the cloud and asks for help, you can get a solution to safely navigate through an environment," Glasheen says.

      The technology has great potential for the future. While delivery drones are often discussed as a public use, a UAV that can exchange data with the cloud could improve military reconnaissance and even weather forecasting.
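Glasheen's actual systems are not published here, but the "ping the cloud and ask for help" loop she describes might look something like this minimal sketch; the endpoint, payload and response schema are all invented for illustration:

import json
import urllib.request

# Invented drone state; a real system would read this from onboard sensors.
state = {"lat": 39.7392, "lon": -104.9903, "battery": 0.62,
         "event": "storm_cell_ahead"}

req = urllib.request.Request(
    "https://planner.example.com/replan",  # placeholder cloud planning service
    data=json.dumps(state).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# The "big brain" replies with fresh waypoints for the "small brain" to fly.
with urllib.request.urlopen(req) as resp:
    waypoints = json.load(resp)["waypoints"]  # assumed response shape
print(waypoints)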

      "It's all so exciting. The field is evolving every day and you can see new applications," Glasheen says. "A lot of it still unknown, which makes some people uncomfortable, but for me it's thrilling."

                CASK Resource (Big Data)        
TX-Irving, All applicants must have a minimum of 7+ years industry experience in order to apply. No 3rd party resumes accepted. LOCATION: Dallas, Texas DURATION: 6 months JOB RESPONSIBILITIES: We are doing a large Big Data project for a healthcare client in Dallas. We need someone that can look at the data we have loaded now and map it to how we will have to set up and load it in Cask. Should be Cask on Horton
                My cf.Objective() 2013 Presentations        

      Other than noting back in January that all three(!) of my talk proposals were accepted, I haven't blogged about them since, so the only information about them is on the cf.Objective() web site. The session overviews give a fair sense of what you should get out of each presentation and roughly what they'll cover.

      Since I have just now finished the three presentations and got all the code working, I thought I'd write up some thoughts about the talks, to help folks who are on the fence decide 'for' or 'against'.

      • Learn You a What for Great Good?, subtitled Polyglot Lessons to Improve Your CFML, this talk looks at some idioms and constructs in JavaScript, Groovy and Clojure (with a brief mention of Scala too), and shows how you can apply them to CFML. A common thread thru the examples is closures, added in ColdFusion 10 and Railo 4, and the code examples make heavy use of them, showing map, reduce, and filter functions operating on CFML arrays and structs to provide many of the benefits that collections have been providing to other languages for quite some time. From JavaScript, I also look at prototype-based objects, and show how to implement that in CFML.
      • ORM, NoSQL, and Vietnam plays on blog posts by Ted Neward and Jeff Atwood to put Object-Relational Mapping under the microscope and look at where the mapping breaks down and how it can "leak" into your code, making your life harder. After that I take a quick walk thru the general "No-SQL" space and then focus on document-based data stores as a good (better?) match for OOP and provide examples based on MongoDB and cfmongodb, with a quick look at how common SQL idioms play out in that world.
• Humongous MongoDB looks at replica sets, sharding, read preference, write concern, map/reduce and the aggregation framework, to show how MongoDB can scale out to support true "Big Data". The talk will feature a live demo of setting up a replica set and using it from CFML, including coping robustly with failover, and a live demo of setting up a sharded cluster (and using it from CFML) to show how MongoDB handles extremely large data sets in a fairly simple, robust manner. (A small sketch of the replica set concepts appears after this list.)
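Not code from the talks (those demos are in CFML), but as a rough flavor of the replica set concepts the third session covers, here is a minimal pymongo sketch; the host names, database and collection are hypothetical:

from pymongo import MongoClient, ReadPreference, WriteConcern

# Hypothetical three-member replica set; host names are placeholders.
client = MongoClient("mongodb://db1:27017,db2:27017,db3:27017/?replicaSet=rs0")

# Reads may be served by a secondary when one is available; writes must be
# acknowledged by a majority of the replica set before they count as durable.
db = client.get_database(
    "demo",
    read_preference=ReadPreference.SECONDARY_PREFERRED,
    write_concern=WriteConcern(w="majority"),
)

db.events.insert_one({"type": "page_view", "path": "/home"})
print(db.events.count_documents({"type": "page_view"}))

If the primary fails over, the driver rediscovers the new primary from the replica set configuration, which is exactly the robustness the live demo aims to show.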

      At the start of each of my talks, I have a "You Might Prefer..." slide listing the alternative talks you can attend if you don't fancy mine after you've seen the agenda slide - I won't be offended!

      The slides (and all code) will be available after the conference. I'll post the slides to my presentations page and the code will go up on my Github repository. If any user groups would like me to do remote presentations of these talks later in the year (and record them and post them to Charlie Arehart's UGTV site), just contact me to figure out dates and times.


                Is ‘Big Data’ Big Enough for IT Security?        
      We all have seen or heard about Big Data in various environments of science and technology such as meteorology, astronomy, physics simulations, demographic studies, Internet usage, and transaction analysis. Apparently, the government, military, and even the scientific community uses Big Data. So, what is so big about Big Data?
MariaDB now supports Big Data analytics; the first version of its dedicated ColumnStore engine is available for download        
The Spring 2016 release of MariaDB Enterprise has been announced;
the database vendor highlights its high availability and strengthened security

MariaDB® Corporation has announced the Spring 2016 release of MariaDB Enterprise. This new version of MariaDB is presented as having stronger security than previous releases. According to the company, the new features added in this release protect data against attacks...
Big Data Statistics Fundamentals Course        
Data science does not have an independent disciplinary system of its own; it draws on statistics, machine learning, data mining, databases, distributed computing, cloud computing, information visualization and other techniques and methods to deal with data. Narrowly speaking, though, I think data science comes down to solving three problems: 1. data pre-processing; 2. data interpretation; 3. data modeling and analysis. These are also the three big steps in our data work: 1. raw data must go through a chain of pre-processing (collection, extraction, cleaning, organizing and so on) before it becomes high-quality data; 2. we want to see what the data "looks like"
                Is Big Data the Eyeballs of the Dot-Com Era?        
      Remember the halcyon days of the Dot-Com era? A frothy stock market, venture capital money flowing like water and famous sock puppets characterized the exuberance of the day. One company (Boo.com) spent $188 million in just six months to create an online fashion store. And 16 start-ups spent over $2 million each for a 30 […]
                Global Hadoop Market Size & Share Will Reach USD 87.14 billion by 2022: Zion Market Research        

      Sarasota, FL, Aug. 08, 2017 (GLOBE NEWSWIRE) -- Zion Market Research has published a new report titled "Hadoop Market by Type (Software, Hardware and Services) for BFSI, Government Sector, IT & ITES, Healthcare, Telecommunication, Retail and Other End-Uses: Global Industry Perspective, Comprehensive Analysis, Size, Share, Growth, Segment, Trends and Forecast, 2016 – 2022".  According to the report, the global Hadoop market was valued at approximately USD 7.69 billion in 2016 and is expected to reach approximately USD 87.14 billion by 2022, growing at a CAGR of around 50% between 2017 and 2022.

Hadoop is an open source framework designed for storing and processing big data in a distributed environment across clusters of computers, using simple programming models to handle both structured and unstructured data. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
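As a hedged illustration of those "simple programming models," the classic word-count job can be written as two tiny scripts and run with Hadoop Streaming (script names and paths below are hypothetical):

#!/usr/bin/env python3
# mapper.py -- read raw text on stdin, emit one "word<TAB>1" pair per word.
import sys

for line in sys.stdin:
    for word in line.split():
        print(word + "\t1")

#!/usr/bin/env python3
# reducer.py -- Hadoop Streaming sorts by key, so each word's counts arrive
# consecutively; sum them and emit "word<TAB>total".
import sys

current, total = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word != current:
        if current is not None:
            print(current + "\t" + str(total))
        current, total = word, 0
    total += int(count)
if current is not None:
    print(current + "\t" + str(total))

# Typical invocation (paths hypothetical):
#   hadoop jar hadoop-streaming.jar -input /data/in -output /data/out \
#     -mapper mapper.py -reducer reducer.py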

      Browse through 29 Market Tables and 32 Figures spread through 110 Pages and in-depth TOC on "Hadoop Market by Type (Software, Hardware and Services) for BFSI, Government Sector, IT & ITES, Healthcare, Telecommunication, Retail and Others End-Uses: Global Industry Perspective, Comprehensive Analysis, Size, Share, Growth, Segment, Trends and Forecast, 2016 – 2022".

      Request Free Sample copy of Global Hadoop Market Report @ https://www.zionmarketresearch.com/sample/hadoop-market

Hadoop is useful as a scalable storage platform, since it can store and distribute very large amounts of data. Another benefit of Hadoop is that it is a cost-effective storage solution for businesses. Additionally, Hadoop is flexible: with its help, businesses can easily access new data sources and tap into different types of data, both structured and unstructured, to generate value from that data. ...

      Full story available on Benzinga.com


                Seven new academic programs coming to campus this fall        
You might already know that IUPUI offers more than 350 undergraduate, graduate and professional programs. And come this fall, there will be a few more. Here's a look at seven new academic programs from a variety of schools across campus:

Ph.D. in data science, School of Informatics and Computing: This degree—the first of its kind in Indiana and in the Big Ten, and one of only a handful in the United States—leads to positions in academia as well as in industry. In fact, Glassdoor, a job and employment-recruiting website, ranks data scientist as the No. 1 job in America based on the number of job openings, salary and overall job-satisfaction rating. According to Glassdoor, the median base salary for a data scientist is $116,840. The field of data science involves collection, organization, management and extraction of knowledge and insights from massive, complex, heterogeneous data sets commonly known as "big data."

Ph.D. in American studies, School of Liberal Arts: This nontraditional doctoral program looks to recruit students interested in exploring issues through a multidisciplinary approach, drawing on courses already being offered across the School of Liberal Arts. A doctoral internship of at least a year will help students translate their research into a variety of careers. "The Ph.D. program in American studies at IUPUI does not tweak the traditional Ph.D. model, but rather builds an infrastructure for a collaborative and applied graduate school experience in order to close the distance between academia and the world that surrounds it," said Raymond Haberski Jr., professor of history and director of American studies.

Graduate minor in communicating science, Department of Communication Studies, School of Liberal Arts: Scientists and health professionals today need to connect to and engage with the lay public, policymakers, funders, students and professionals from other disciplines. As a result, they find the need to tailor their communication for a variety of audiences. This program—designed for future scientists, including researchers and practitioners, who find themselves increasingly responsible for public speaking and writing—will increase students' career prospects, help them secure funding and help them serve as effective teachers. "The courses will offer more than public speaking and writing tips," said Krista Hoffmann-Longtin, assistant professor of communication studies in the School of Liberal Arts and assistant dean for faculty affairs and professional development in the School of Medicine. "Scientists will learn to improvise messages; to tell relevant stories; and to connect effectively with students, collaborators and funders."

Liberal arts and management certificate, School of Liberal Arts: A 2013 study suggests that a liberal arts degree coupled with other skills can nearly double job prospects when those skills include marketing, business, data analysis and management—just to name a few. "This certificate offers a course of study from both liberal arts and business to better prepare the 21st-century liberal arts graduate to respond to the challenges of a more complex world," said Kristy Sheeler, associate dean for academic programs in the School of Liberal Arts and a professor in the Department of Communication Studies. Contact Sheeler with questions about this new program.

Doctor of public health in global health leadership, Richard M. Fairbanks School of Public Health: The school already knows what some students in this new program will do when they graduate: They'll become state health commissioners; ministers of health; program officers; and mid- to senior-level managers in government agencies, foundations, nonprofits and nongovernmental organizations. That's based on experiences of a similar program at the University of North Carolina at Chapel Hill. The person who helped design and lead that program is now at IUPUI: Sue Babich, associate dean of global health, director of the doctoral program in global health leadership, and professor of health policy and management. The degree prepares students to be leaders who can address the world's challenging and complex public health issues. The three-year degree is a distance program, with classes delivered in real time via internet video. Students meet face-to-face three times each year in years one and two, and they complete dissertations in year three.

Master of Science degree in product stewardship, Richard M. Fairbanks School of Public Health: The only academic degree available today designed to prepare students for leadership roles in the emerging field of product stewardship will train professionals to help businesses in a wide range of industrial fields navigate increasingly complex regulations as they advocate for the production of products in ways that ease regulatory compliance, minimize risks to people and the environment, and boost profitability. The online 30-credit-hour degree is expected to attract, among others, professionals who are already active in the product-stewardship field seeking formal training that will allow them to move up in their product-stewardship organizations, and professionals from a wide range of other backgrounds, including environmental health, regulatory compliance, industrial hygiene, occupational health and safety, sustainability, product development, supply chain, and law.

Master of Arts in teaching English to speakers of other languages (TESOL), Department of English, School of Liberal Arts: This 31-credit-hour degree provides both a strong theoretical foundation and hands-on practical experience to prepare national and international graduate students to become effective teachers of English to adult learners who speak other native languages, both in the United States and abroad. Working with IUPUI's award-winning faculty, students will experience rich opportunities in teaching practica, including not only English for academic purposes but also English for specific purposes—for example, academic, legal, business and medical English. The program features a unique curricular strength in second-language research, materials preparation, curriculum design and the use of technology in second-language learning. "It is thrilling to be able to launch the Master of Arts in TESOL at IUPUI," said Ulla Connor, director of the program. "This program is the culmination of TESOL and applied linguistics programming in the Department of English at IUPUI over the past 30 years. Our previous programs include the English for Academic Purposes Program for international students, which began in 1985; the International Center for Intercultural Communication, which started in 1998; and the Program for Intensive English that we began in 2015."
                SAP Vora updates target business insights from Hadoop big data        
      SAP Vora has been updated to include features that make it easier to deploy and use to get insights from Hadoop big data; SAP IBP brings intelligence to the supply chain.
                Freedom of Expression?        
      A guest post by frequent contributor to this blog, Barry Eisler. I chime in midway.

Barry sez: I just learned about an event put on by an organization called New America (formerly The New America Foundation): Amazon's Book Monopoly: A Threat to Freedom of Expression? Ordinarily, propaganda is something that concerns me, but when it veers this far off into parody, I sometimes welcome it as a comic diversion.

      Because, come on, putting your tendentious conclusion right there in the title and disguising it as a question, while an impressively textbook instance of question-begging, in this context is also pretty funny. Because, “Hey, we’ve already established that Amazon is a monopoly; we’re just here to determine how much of a threat the company poses to Freedom and All That Is Good. Is it an existential threat, like Roger Cohen said about ISIS? Or merely an extremely threatening threat?”

And who knows, maybe they'll answer the question, "No," right? Maybe the panelists will decide that Amazon's "book monopoly" is actually a benefit to freedom of expression, as monopolies often are. It's not as though they've structured things so that the question answers itself, and I don't know why anyone would suspect this panel might be anything other than a diverse collection of open-minded people honestly engaging in free inquiry and the pursuit of knowledge wherever the facts may lead!

      Thanks to the efforts of serious-sounding organizations like New America (and if that vague but happy-sounding name didn’t cause your bullshit detector to at least tingle, it should—see also Americans for Prosperity and the Center for American Progress), this “Amazon is a Monopoly” silliness is so persistent that Joe and I dealt with it in our inaugural post on zombie memes—“arguments that just won’t die no matter how many times they’re massacred by logic and evidence.” Half the purpose of the Zombie Meme series is to save Joe and me from having to repeat ourselves, so if you want to have a laugh about why, despite its persistence, “Amazon is a Monopoly” is so embarrassingly dumb and misguided, here’s your link.

      But here’s the amazing part: “Amazon is a monopoly” is actually the cleverhalf of the event’s title. The really funny part is what follows: that Amazon poses a threat to freedom of expression!

      As I said in a previous Techdirt guest post called Authors Guilded, United, and Representing…Not:

      Given that Amazon’s self-publishing platform enables all authors to publish whatever they like and leaves it to readers to decide what books they themselves find beneficial, while the New York Big Five (no concentrated market power in a group with a name like that!) has historically rejected probably 999 books for every one they deem worthy of reaching the public, a few questions present themselves. Such as:

• Who has really been "manipulating and supervising the sale of books and therefore affecting the exchange of ideas in America," and who has really "established effective control of a medium of communication"—an entity that screens out 99.9% of books, or one that has enabled the publication of any book?

• Who has really been running an uncompetitive, controlled, supervised, distorted market for books—a company dedicated to lower prices, or a group calling itself the Big Five that has been found guilty of conspiracy and price fixing?

• Who is really restoring freedom of choice, competition, vitality, diversity, and free expression in the American book market—an entity that consigns to oblivion 999 books out of a thousand, or one that enables the publication of all of them?

• And who is really ensuring that the American people determine for themselves how to take advantage of the new technologies of the 21st Century—an entity responsible for zero innovation and dedicated to preserving the position of paper, or one that has popularized a new publishing and reading platform that for the first time offers readers an actual choice of formats?

Think about it. This "New America" organization has put together a panel dedicated to persuading you that there was more freedom of expression when an incestuous group of five Manhattan-based corporations held the power to disappear 999 books out of every thousand written, and indeed performed that disappearance as the group's core function (they call this "curation"). And that, now that Amazon's KDP platform has enabled all authors to publish virtually anything they want, freedom of expression is being threatened.

       For an organization calling itself “New America,” these jokers sure seem wedded to the old version.

      In fairness to New America, I should note that their worldview is hardly unprecedented. The notion that the traditional way of doing things is ipso facto the best way of doing things was lampooned by Voltaire over 150 years ago through his character Dr. Pangloss, who was convinced (before experience in the world introduced doubts) that “All is for the best in this best of all possible worlds.” And Pangloss was himself based on the religious philosophy known as theodicy—a word coined over 300 years ago to describe a kind of faith that’s doubtless as old as the human race (and a word I admit I like because it sounds a bit like “idiocy”).

      In fact, it was as recent as, say, the 1950s that a group of tweed-jacketed, straight white male college professors were genuinely convinced that the collection of books they deemed the most intrinsically worthy—all, coincidentally, written by other straight white males—represented the maximally possible amount of valuable expression, information, and ideas. They even called their collection the “canon,” which I admit did tend to make their subjective choices sound important and even divinely ordained. As people came to question the absence of women and minority writers from this collection selected exclusively by straight white males, I imagine the straight white males genuinely believed that broadening the “canon” to include women and minorities was a threat to freedom of expression and all that. This is just the way a lot of people are wired, especially when status and privilege are part of the mix.

      And really, you do have to take a moment to applaud the mental gymnastics required of otherwise presumably intelligent people to say shit like “more authors writing more books reaching more readers is threatening freedom of expression, the flow of information, and the marketplace of ideas.” It’s War is Peace/Ignorance is Strength/Freedom is Slavery level doublethink. On the one hand, it’s sad, but on the other hand, in all the universe could there be a race as capable as humans of clinging so resolutely to faith in the face of so many contrary facts? Seen in this light, there’s something tragically beautiful about it.

      And while I admit that New America’s “day is night, black is white” bizarro worldview isn’t easy to parody, I can’t resist trying. So…

      Coming up next from New America: The Internet’s Dictatorial Grip: Impeding Access to Information? And The Tyranny of the Cell Phone: Shutting Down Communication? And Our Addiction to Paved Roads: A Threat to Freedom of Movement?

One more thing about this event that's unintentionally hilarious, and then I need to get back to something worthwhile (AKA, the new manuscript). Take a look at the guest list. If you hired a team of NASA scientists to design the most rabidly, incestuously anti-Amazon panel possible, this is pretty much the group the team would propose. Though I doubt even the scientists (assuming they had a little dignity) would have gone so far as to bring in Douglas Preston and his literary agent, Eric Simonoff. I mean, this is getting pretty close to just adding clones of existing panelists and eliminating the last fluttering fig leaf of diversity.

      They also have the dean of the Amazon Derangement crowd, Scott Turow. And Franklin Foer, who in fairness should be disqualified from even being on this panel because of his claim—in his much-derided “Let us kneel down before Amazon” screed—that “That term [monopoly] doesn’t get tossed around much these days, but it should”!

      By the way, I wouldn’t be surprised if Foer makes the same cringe-worthy claim again, on this very “Amazon is a Monopoly” panel. The anti-Amazon crowd has never been particularly educable.

      Also present will be Mark Coker, the head of Smashwords, an Amazon competitor. And author Susan Cheever, a member of Authors United, an organization that represents pretty much the platonic ideal of Amazon Derangement Syndrome. A couple of anti-trust lawyers to provide a veneer of legal gravitas (and to troll for clients, no doubt). And a second-year law student named Lina Khan who has argued that Amazon “should alarm us.”

      And that’s it. That’s as diverse and wide-ranging as the lineup gets. The full gamut of viewpoints, from A…all the way to B.

      Although really, even that feels a little generous.

      Oh, by the way, Eric Schmidt, Executive Chairman of Google, another Amazon competitor, is the chairman of New America’s board of directors, too. No conflict of interest there. Nothing to disclose to anyone who might think this is some sort of disinterested, scholarly event.

      So yeah, it’s really that much of a hive-mind lineup. But that’s not even the best part. The best part is, this remarkably insular and incestuous exercise in groupthink has been assembled to speak out against a purported threat to…freedom of expression! The flow of information! And the marketplace of ideas!

      None of this is an accident, by the way. It isn’t just stupidity and incompetence. There’s a reason organizations will try to take a narrow outlook and propagate it through multiple mouthpieces: doing so can create the impression that a rare and radical notion is in fact widely held—held even by ostensibly disparate groups—and therefore more trustworthy. Indeed, this form of propaganda is a favorite of some of the same reactionary groups New America is showcasing on its panel. As I said recently about the supposedly “unprecedented joint action” of some booksellers, authors, and agents complaining together about Amazon:

      Which brings us to the second revealing aspect of this “propaganda masquerading as an interview” drill. You see, in the standard “blow-job masquerading as interview” gambit, it’s generally enough to hope the reader will just assume the interviewer and interviewee are working at arms-length. Making the point explicitly isn’t really the done thing. Here, however, perhaps not trusting readers to be sufficiently gulled, the ABA and AG are at pains to describe the “unprecedented joint action” of the AG, Authors United, the ABA, and the Association of Authors’ Representatives in going after Amazon for monopolizing the marketplace of ideas, devaluing books, and generally crushing dissent, democracy, and all that is good. The impression they’re trying to create is, “Wow, if so many separate organizations hate Amazon, Amazon must be doing something bad.”

      But what’s critical to understand is that the most fundamental purpose of the Authors Guild, Authors United, the American Booksellers Association, and the Association of Authors’ Representatives is to preserve the publishing industry in its current incarnation. Whatever marginal differences they might have (I’ve never actually seen any, but am happy to acknowledge the theoretical possibility) are eclipsed by this commonality of purpose. Under the circumstances, the fact that these four legacy publisher lobbyists agree on something is entirely unremarkable (indeed, what would be remarkable would be some evidence of division). But if people recognize the exercise as a version of “No really, I read it somewhere…okay, I wrote it down first,” the propaganda fizzles. And that’s why these propagandists have to nudge readers with the bullshit about the “unprecedented joint action.” Otherwise, when Authors Guild Executive Director Mary Rasenberger cites Authors United pitchman Doug Preston as though Preston were a separate, credible source, people might roll their eyes instead of nodding at the seriousness of it all. They might even giggle at the realization that all those “When did Amazon stop beating its wife?” questions were functionally being put by Rasenberger to herself.

      So no, this wasn’t remotely a cross-examination, or even a cross-pollination (indeed, publisher lobbyists are expert at fleeing anything that offers even the slightest whiff of actual debate—which does make their alleged devotion to the Free Flow of Ideas and Information as the Engine of Democracy worthy of a smile, at least, if nothing else). It was just a stump speech lovingly hosted by someone else’s blog. The sole reason for the exercise was to create the misleading appearance of multiple, arms-length actors when functionally there is only one.

      In fairness to the aforementioned Unprecedentedly Joint Actors, there is a rich heritage behind this form of propaganda. For example, in the run-up to America’s second Iraq war, Dick Cheney would have someone from his office phone up a couple of pet New York Times reporters, who would then dutifully report that anonymous administration officials believed Saddam Hussein had acquired aluminum tubes as part of his nuclear weapons efforts…and then Cheney would go on all the Sunday morning talk shows and get to say, “Don’t take my word for the aluminum tube stuff—even the New York Times is reporting it!”

      So leave aside the fact that the “joint action” in question is anything but unprecedented—that it is in fact publishing establishment SOP. Anyone familiar with the record of these organizations will instantly realize that the “unprecedented joint action” in question is a lot like the “joint action” of all four fingers—plus the thumb!—of someone throwing back a shot of tequila. Like that of a little boy pleasuring himself—with both hands!—and trying to convince anyone who will listen that the Unprecedented Left and Right Action is proof that “Everybody loves me!”

      Okay, I apologize for the multiple excerpts from previous posts. But what are you going to do? These bloviators keep vomiting up the same tired bullshit, no matter how many times it’s debunked. It just saves time to refer to the previous debunkings rather than typing it all out again.

      My advice to New America? If you’re more than just a propaganda operation—if you really do care about freedom of expression, and the flow of information, and the marketplace of ideas—you might want to add at least a token panelist with a viewpoint that differs even just a tiny bit from that of the nine Borg you’ve assembled to intone that Amazon Is Evil and Will Destroy All That Is Good. Otherwise, your event is going to feel more like a circle jerk and less like sex. And, doubtless, with similarly productive results.

      Joe sez: And just when I think I’m out…

      Thanks, Barry, for turning a spotlight on this silliness, and patiently picking apart why it is so silly. I’m sure the panel will be a resounding success, much like all circle jerks and echo chambers are for those involved. Masturbation is supposed to be satisfying, and a nice “atta boy!” and backslap at the finish seems preferable to eating the soggy biscuit.

      Don’t Google that if you don’t know what I mean. You can’t unlearn it.

      One of the reasons I’ve largely eschewed activism lately is that I haven’t seen any ill effects from all the Amazon bashing being done by the usual spin doctoring suspects.

      At the risk of invoking Godwin’s Law, the propaganda classic Triumph of the Will was just released on Blu-ray for the first time. It’s an effective piece of filmmaking, and Frank Capra imitated a lot of elements from it for his Why We Fight series.

      It worked. And it is still being imitated today, both as a film, and as propaganda. Fear mongering is an old standby for getting people on your side. I wrote a whole post about alarmist terminology and spin.

      But I don’t think this approach works when it comes to Amazon. People aren’t so ready to buy what the pinheads are selling. Today we can have the New York Times, which I believe still has the motto “All the news that’s fit to print”, show such stunning anti-Amazon bias that the public editor has called it out more than once, and the public simply doesn’t give a shit. Amazon still gets their approval and their business, no matter how many times David Streitfeld one-finger-types his screeds while busting out knuckle babies with his other hand.

      The public likes Amazon. Even if it were true that Amazon is planning to overthrow the government and replace the Bill of Rights with a guarantee of same day free shipping, its approval rating is so high that I don’t think most folks would care.

      But for all the alarmist rhetoric and soothsaying predictions of world domination, I’ve yet to see anyone other than Big 5 apologists and their NY media cronies show much concern over Amazon’s mounting dominance of online retail.

      Maybe that’s because—wild guess here—Amazon offers authors unprecedented opportunity to reach readers, and offers readers the widest selection at the lowest possible prices coupled with good customer service.

      Authors United, and the NYT, are doing everything they’re supposed to be doing to spread their anti-Zon propaganda, but the people don’t care.

      If I had faith in human nature, I’d posit that access to the Internet (and the ability for anyone with second grade spelling skills to type words into a search engine) can reveal in a click or two what utter nonsense the morons are spouting.

      But I think the more realistic answer is that people simply like Amazon because it has a wide selection, low prices, and good customer service.

      So I no longer feel the need to correct the greedy, self-interested 1% of authors who want to prop up an archaic, inefficient, and ruthless publishing industry with stupid organizations and articles and events. Joe Average might very well read about this panel in a Streitfeld spat of “journalism”, cluck his tongue at how Amazon is destroying freedom of expression, and then quickly forget about it when the UPS guy knocks on the door with a box of Bounty because yesterday Joe used his Amazon Dash button to order more.

      The legacy publishing industry is dying. Once it lost its lock on distribution, it lost the majority of its power. The only ones who will mourn that industry are the few handfuls of authors it made rich. And when their corporate masters merge and downsize into inevitable bankruptcy, watch how quickly they jump on Amazon’s teat when the seven figure advances are gone.

      But, for old times’ sake, let me fisk New America’s event description. Their nonsense in italics, my replies in regular font.

      Amazon dominates the U.S. book market to a degree never before seen in America.

      But does it dominate the U.S. book market to a degree never before seen in Canada?

      Okay, I’m making fun of the lousy sentence, but isn’t that like saying “In my house I dusted the bookcases to a degree never before seen in my house?”

      That's silly. Especially since I switched to ebooks and got rid of my bookcases.

      This corporation dominates every key segment of the market.

      Wow, that's a lot of dominance. I hope the public has a safeword.

      We had a cartel dominating publication and distribution for decades. It was an oligopoly called the Big 6. Not only did it reject a high percentage of books submitted to it—which can be argued is a form of censorship—but when it accepted a book it fucked the author in the ass with unconscionable, one-sided contract terms. Terms even the Big 5-enamored Authors Guild has spoken out against.

      And this immense size gives Amazon unprecedented power to manipulate the flow of books – hence of information and ideas – between author and reader.

      OK, reread what Barry and I have written here. For over a hundred years, publishers have refused to publish the overwhelming majority of books, essentially preventing the public from ever reading them. They had a right to do that, just like Chick-Fil-A has a right to be closed on Sundays for ridiculous religious reasons.

      But unlike the Big 6, or Chick-Fil-A, Amazon is allowing more traffic than ever before. More books are flowing with Amazon than flowed with the Big 6.

      Plus, Amazon isn’t a monopoly, and doesn’t control the Internet, so in cases where Amazon decides it doesn’t want to sell something, it can’t prevent it from being sold elsewhere.

      Last summer a group of authors made the case that Amazon’s actions constitute an abuse of its monopoly powers and threatens this vital marketplace of ideas.

      It was a shitty case. But let’s not allow facts to get in the way of good propaganda. Because if you keep repeating the same lie, some people are bound to start believing it. 

      Unless they're Prime members. Then they'll cluck their tongues and ask Alexa to pre-order the new Barry Eisler book.

      Amazon’s actions, they wrote, may already be affecting what authors write and say.

      As evidenced by Amazon refusing to sell any work by any signatory of Authors United.

      Oh… wait.

      But look how Amazon has forced writers to cower in the shadows, fearful of offering any sort of critique.

      Oh… wait.

      Hmm. Doesn’t a panel about Amazon restricting freedom of expression prove that Amazon can’t restrict freedom of expression? Or if it can, doesn’t want to?

      Oops, my bad. They used the word "may". So it could read "may already be affecting what authors write and say, even though there is no evidence or logic to support that conclusion." Like someday I "may" own my own country, which I'll name Joetopia and make our main export beer parties. If you'd like Joetopia to export a beer party to you, let me know because it "may" happen. Wait by the phone until you hear back.

      The authors strongly urged antitrust regulators to take action, in what would be the most important antitrust case since Microsoft in the late 1990s.

      Except for the tiny little fact that, you know, THERE IS NO CASE.

      Barry and I take a lot of time to add these links to prove our points. You diligent readers are clicking on them, right?

      Join New America’s Open Markets program for a discussion of Amazon’s monopoly over books and what it means for American readers and America’s democracy.

      For God’s sake, someone think of the children! Because an online retailer is all that stands between the freedom to vote for representatives in government (that's the definition of democracy), and a zombie world where neighbors feast on neighbors and the only law comes from the business end of a twelve gauge. Because that argument makes as much sense as theirs.

      Some of the nation’s best-known authors will discuss their personal experiences with Amazon.

      And nary a one with a contrary point of view! Perhaps because they couldn't find any author with a good personal experience with Amazon. I mean, other than a hundred thousand or four. But I'm sure New America has much better things to do with their time than a little research.

      Antitrust lawyers and experts in Big Data and price discrimination will then discuss the larger effects of the corporation’s behavior, and whether the government should bring a case against Amazon.

      With Data so Big it’s Capitalized! Did that become a thing and I missed it?

      And what could they possibly say in regard to price discrimination? Amazon fights to keep prices low. The Big 6 fight to keep them high. They illegally collude to keep them high. They print the prices on their damn books to keep them high.

      Could they be going into the nefarious business practice of co-op, and Amazon charging publishers for better visibility? Is that the discrimination they mean? Or maybe loss leads?

      Last I checked, both were not only legal, but commonplace in retailers.

      I wonder what the antitrust lawyers will say about Amazon allowing anyone to sell through Amazon. In other words, if Amazon decided it no longer wanted to sell Big 6 titles, I could open up an Amazon seller account and sell Big 6 titles on Amazon. Can someone explain to me how that limits the flow of books between reader and author?

      Follow the discussion online using #BookMonopoly and follow us @NewAmerica.

      No thanks. But here's a hashtag you can follow: #StoptheStupid.

      Lunch will be provided.

      And it will be the only substantive thing offered that afternoon.

      Now I’m going back to my WIP. When the NYT write-up of this stupid event runs, I’m going to ignore it.

                Dataism: Getting out of the 'job loop' and into the 'knowledge loop'        
      From deities to data - "For thousands of years humans believed that authority came from the gods. Then, during the modern era, humanism gradually shifted authority from deities to people... Now, a fresh shift is taking place. Just as divine authority was legitimised by religious mythologies, and human authority was legitimised by humanist ideologies, so high-tech gurus and Silicon Valley prophets are creating a new universal narrative that legitimises the authority of algorithms and Big Data." Privileging the right of information to circulate freely - "There's an emerging market called Dataism, which venerates neither gods nor man - it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system... Like capitalism, Dataism too began as a neutral scientific theory, but is now mutating into a religion that claims to determine right and wrong... Just as capitalists believe that all good things depend on economic growth, so Dataists believe all good things - including economic growth - depend on the freedom of information." Our unparalleled ability to control the world around us is turning us into something new - "We have achieved these triumphs by building ever more complex networks that treat human beings as units of information. Evolutionary science teaches us that, in one sense, we are nothing but data-processing machines: we too are algorithms. By manipulating the data we can exercise mastery over our fate." Planet of the apps - "Many of the themes of his first book are reprised: the importance of the cognitive revolution and the power of collaboration in speeding the ascent of Man; the essential power of myths — such as religion and money — in sustaining our civilisations; and the inexcusable brutality with which our species treats other animals. But having run out of history to write about, Harari is forced to turn his face to the future... 'Forget economic growth, social reforms and political revolutions: in order to raise global happiness levels, we need to manipulate human biochemistry'... For the moment, the rise of populism, the rickety architecture of the European Union, the turmoil in the Middle East and the competing claims on the South China Sea will consume most politicians' attention. But at some time soon, our societies will collectively need to learn far more about these fast-developing technologies and think far more deeply about their potential use." also btw...
      • Preparing for our Posthuman Future of Artificial Intelligence - "By exploring the recent books on the dilemmas of AI and Human Augmentation, how can we better prepare for (and understand) the posthuman future? By David Brin." (omni o)
      • The Man-Machine Myth - "Beliefs inspired by the cybernetic mythos have a quasi-theological character: They tend to be faith-based."
      • Unsettling thought of the day
      • Each technological age seems to have a "natural" system of government that's the most stable and common... Anyway, now we've entered a new technological age: the information age. What is the "natural" system of government for this age?

        An increasing number of countries now seem to be opting for a new sort of illiberal government - the style of Putin and the CCP. This new thing - call it Putinism - combines capitalism, a "deep state" of government surveillance, and social/cultural fragmentation.

        It's obviously way too early to tell, but there's an argument to be made that Putinism is the natural system of government now. New technology fragments the media, causing people to rally to sub-national identity groups instead of to the nation-state.

        The Putinist "deep state" commands the heights of power with universal surveillance, and allies with some rent-collecting corporations. Meanwhile, IF automation decreases labor's share of income and makes infantry obsolete, the worker/soldier class becomes less valuable.

        "People power" becomes weak because governments can suppress any rebellion with drones, surveillance, and other expensive weaponry. Workers can strike, but - huge hypothetical assumption alert! - they'll just be replaced, their bargaining power low due to automation.

        In sum: Powerful authoritarian governments, fragmented society, capitalism, "Hybrid warfare", and far less liberty.
      • The Totalitarian - "Putinist models seem to curtail personal freedom and self-expression. Chases away innovation class. In the long run this makes them unable to keep up with more innovative, open societies. But innovative open societies are also fissiparous in the long run. They need a strong centralized, even authoritarian, core. To wit the big democracies also have deep states, just ones that infringe on domestic public life less than Putinist do. Automation makes mass citizenry superfluous as soldiers, workers or taxpayers. The insiders' club is ever-shrinking. Steady state of AI era is grim. One demigod and 10 billion corpses/brain-in-jars depending on humanism quotient of the one. The three pillars for this end state are strong AI, mind uploading/replication, and mature molecular nanotechnology."
      • Capitalism and Democracy: The Strain Is Showing - "Confidence in an enduring marriage between liberal democracy and global capitalism seems unwarranted."
      • So what might take its place? One possibility[:] ... a global plutocracy and so in effect the end of national democracies. As in the Roman empire, the forms of republics might endure but the reality would be gone.

        An opposite alternative would be the rise of illiberal democracies or outright plebiscitary dictatorships... [like] Russia and Turkey. Controlled national capitalism would then replace global capitalism. Something rather like that happened in the 1930s. It is not hard to identify western politicians who would love to go in exactly this direction.

        Meanwhile, those of us who wish to preserve both liberal democracy and global capitalism must confront serious questions. One is whether it makes sense to promote further international agreements that tightly constrain national regulatory discretion in the interests of existing corporations... Above all... economic policy must be orientated towards promoting the interests of the many not the few; in the first place would be the citizenry, to whom the politicians are accountable. If we fail to do this, the basis of our political order seems likely to founder. That would be good for no one. The marriage of liberal democracy with capitalism needs some nurturing. It must not be taken for granted.
      • G20 takes up global inequality challenge - "Even before the final communiqué is drafted for the annual G20 summit the leaders of the world's largest economies already seemed to agree on their most pressing priority: to find a way to sell the benefits of globalisation to an increasingly sceptical public. As they arrived in the Chinese city of Hangzhou over the weekend, many were on the defensive amid a welter of familiar complaints back home: frustratingly slow growth, rising social inequality and the scourge of corporate tax avoidance."
      • "Growth drivers from the previous round of technological progress are fading while a new technological and industrial revolution has yet to gain momentum," Mr Xi said at the start of the G20, adding that the global economy was at a "critical juncture".

        "Here at the G20 we will continue to pursue an agenda of inclusive and sustainable growth," Mr Obama said, acknowledging that "the international order is under strain".

        Mr Xi, whose country has arguably benefited more than any other from globalisation, struck a similarly cautious note in a weekend speech to business leaders. In China, he said, "we will make the pie bigger and make sure people get a fairer share of it".

        He also recognised global inequity, noting that the global gini coefficient — the standard measure of inequality — had raced past what he called its "alarm level" of 0.6 and now stood at 0.7. "We need to build a more inclusive world economy," Mr Xi said.
      • G20 leaders urged to 'civilise capitalism' - "Chinese president Xi Jinping helped set the tone of this year's G20 meeting in a weekend address to business executives. 'Development is for the people, it should be pursued by the people and its outcomes should be shared by the people', Mr Xi said... Before the two-day meeting, the US government argued that a 'public bandwagon' was growing to ditch austerity in favour of fiscal policy support. 'Maybe the Germans are not absolutely cheering for it but there is a growing awareness that 'fiscal space' has to be used to a much greater extent', agreed Ángel Gurría, secretary-general of the Organisation for Economic Cooperation and Development."
      • Martin Wolf calls for basic income, land taxation & intellectual property reform: Enslave the robots and free the poor
      • The rise of intelligent machines is a moment in history. It will change many things, including our economy. But their potential is clear: they will make it possible for human beings to live far better lives. Whether they end up doing so depends on how the gains are produced and distributed. It is possible that the ultimate result will be a tiny minority of huge winners and a vast number of losers. But such an outcome would be a choice not a destiny. A form of techno-feudalism is unnecessary. Above all, technology itself does not dictate the outcomes. Economic and political institutions do. If the ones we have do not give the results we want, we must change them.
      • From the Job Loop to the Knowledge Loop (via Universal Basic Income) - "We work so we can buy stuff. The more we work, the more we can buy. And the more is available to buy, the more of an incentive there is to work. We have been led to believe that one cannot exist without the other. At the macro level we are obsessed with growth (or lack thereof) in consumption and employment. At the individual level we spend the bulk of our time awake working and much of the rest of it consuming."
      • I see it differently. The real lack of imagination is to think that we must be stuck in the job loop simply because we have been in it for a century and a half. This is to confuse the existing system with humanity's purpose.

        Labor is not what humans are here for. Instead of the job loop we should be spending more of our time and attention in the knowledge loop [learn->create->share]... if we do not continue to generate knowledge we will all suffer a fate similar to previous human societies that have gone nearly extinct, such as the Easter Islanders. There are tremendous threats, eg climate change and infectious disease, and opportunities, eg machine learning and individualized medicine, ahead of us. Generating more knowledge is how we defend against the threats and seize the opportunities.
      • What's more scarce: money, or attention? - "Attention is now the scarce resource."

                Día de Big Data 2017        
      Free webinar addressing the Big Data opportunity, the calculation of its value, and the capabilities needed to take advantage of the technology
                Size Matters: Empirical Evidence of the Importance of Training Set Size in Machine Learning        
      There is much hype around "big data" these days - how it's going to change the world - which is causing data scientists to get excited about big data analytics, and technologists to scramble to understand how they employ scalable, distributed databases and compute clusters to store and process all this data. Interestingly, Gartner dropped "big [...]
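      To make the title's point concrete, here is a minimal sketch of how the effect of training set size can be measured empirically, using scikit-learn's learning_curve helper; the dataset and model are illustrative stand-ins, not the ones from the article:

          from sklearn.datasets import load_digits
          from sklearn.linear_model import LogisticRegression
          from sklearn.model_selection import learning_curve
          import numpy as np

          X, y = load_digits(return_X_y=True)

          # Train the same model on progressively larger slices of the data
          # and score each on held-out folds (5-fold cross-validation).
          train_sizes, train_scores, test_scores = learning_curve(
              LogisticRegression(max_iter=1000), X, y,
              train_sizes=np.linspace(0.1, 1.0, 5),  # 10%..100% of the training fold
              cv=5)

          for n, score in zip(train_sizes, test_scores.mean(axis=1)):
              print(f"{n:5d} training examples -> mean CV accuracy {score:.3f}")

      Plotted, the scores form the familiar learning curve: held-out accuracy typically climbs steeply at first and flattens as the training set grows, which is exactly the empirical evidence the title promises.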
                Maxwell Health Reduces Cloud Storage Costs and Recovery Times with Datos IO RecoverX for Cloud Backup and Recovery        
      Datos IO’s RecoverX platform solves operational challenges and reduces total cost of ownership for Maxwell’s SaaS based platform on MongoDB databases deployed on Amazon AWS public cloud SAN JOSE, Calif., Aug. 10, 2017 — /BackupReview.info/ — Datos IO, the application centric cloud data management company, today announced that Maxwell Health, an HR and benefits technology [...] Related posts:
      1. Datos IO Extends RecoverX to Meet Backup and Recovery Needs for Cloud-Native Workloads On Amazon Web Services
      2. TechTarget Names Datos IO RecoverX Storage Product of the Year Finalist
      3. Datos IO RecoverX Selected as Product of the Year 2016 by Storage Magazine
      4. Datos IO Introduces RecoverX, Industry-First Scale-Out Data Protection Software for Cloud Native and Big Data Environments
      5. Datos IO Teams with NetApp to Deliver Transformational All-Flash Storage and Cloud Data Protection Solution for Next-Generation Data Center Applications

                Database Sharding the Right Way: Easy, Reliable, and Open source - HighLoad++ 2012        

      The presentation the CUBRID team delivered at the Russian HighLoad++ conference in October 2012. The presentation covers Big Data management through database sharding. CUBRID, an open source RDBMS, provides native support for sharding with load balancing, connection pooling, and auto fail-over features.
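      The core mechanic behind sharding is easy to show: a shard key deterministically maps each row, and each query for it, to one database node. Below is a minimal sketch in Python; the node names and the md5-plus-modulo scheme are illustrative assumptions, not CUBRID SHARD's actual API:

          import hashlib

          # Illustrative shard nodes (hypothetical hostnames).
          SHARDS = [
              "db-shard-0.example.com",
              "db-shard-1.example.com",
              "db-shard-2.example.com",
          ]

          def shard_for(shard_key: str) -> str:
              """Map a shard key (e.g., a user id) to one node.

              A stable hash gives a uniform spread; modulo picks the node. Note
              that plain modulo remaps most keys when a node is added, which is
              why real shard middleware makes the hashing scheme configurable.
              """
              digest = hashlib.md5(shard_key.encode("utf-8")).hexdigest()
              return SHARDS[int(digest, 16) % len(SHARDS)]

          # Every lookup for user 42 lands on the same shard:
          print(shard_for("user:42"))

      The load balancing, connection pooling, and auto fail-over mentioned above are concerns of the middleware layer that sits in front of this routing decision.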
                 The Environmental Data Abstraction Library (EDAL): a modular approach to processing and visualising large environmental data         
      Griffiths, G. , Haines, K. , Blower, J. , Lewis, J. and Lin, N. (2014) The Environmental Data Abstraction Library (EDAL): a modular approach to processing and visualising large environmental data. In: 2014 conference on Big Data from Space (BiDS’14), 12-14 November 2014, Frascati, pp. 97-100.
                CMU Delegation at World Economic Forum in China        

      Image of the World Economic Forum sign in Dalian China

      By Heidi Opdyke

      Illah Nourbakhsh in 2015
      Robotics Professor Illah Nourbakhsh leads a discussion on Asia’s Industrialization using visualizations created by his CREATE Lab from Landsat imagery in 2015 at the World Economic Forum's Annual Meeting of the New Champions.

      Carnegie Mellon University researchers and scientists will play an important role in global discussions at the World Economic Forum's Annual Meeting of the New Champions, June 27-29, in Dalian, China.

      Often called "Summer Davos," to differentiate it from the forum's annual winter meeting in Switzerland, the meeting brings together world leaders in business, science, technology, innovation and politics. This year's theme is "Achieving Inclusive Growth in the Fourth Industrial Revolution."

      CMU experts have since 2011 led conversations at the World Economic Forum in fields ranging from robotics to artificial intelligence. CMU scientists often lead discussions, give talks, demonstrate technology and provide their distinctive expertise.

      This year's CMU delegation includes:

      • Erica Fuchs, professor of engineering and public policy;
      • Madeline Gannon, a research fellow with the Frank-Ratchye STUDIO for Creative Inquiry;
      • James McCann, assistant professor in the Robotics Institute;
      • Tom Mitchell, the E. Fredkin University Professor in the Machine Learning Department;
      • Illah Nourbakhsh, professor of robotics; and
      • Gabriel O'Donnell, principal research programmer and analyst in the Robotics Institute.


      CMU will host a panel discussion called "The Future of Production with Carnegie Mellon University," in which Fuchs, Gannon and McCann will discuss rethinking behavior and purpose of industrial robots beyond factory floors, reimagining how large companies can integrate disruption themselves, and reconfiguring how automation collides with human skills.

      Nourbakhsh and O'Donnell will make multiple presentations at the Global Situation Space exhibition. The presentations combine NASA time-lapse satellite imagery and geospatial and econometric data with predictive modelling to explore issues such as emerging megacities, man-made changes to the oceans and trade with China.

      Nourbakhsh's CREATE Lab and its spinoff BirdBrain Technologies will be part of a workshop on building interactive sculptural robots. He will contribute to sessions on the fourth industrial revolution, the digital economy, the creative economy and platforms for artificial intelligence.

      Mitchell will participate in a panel discussion about how the social safety net can respond to the fourth industrial revolution. He recently co-chaired a study of the future of work for the National Academies of Sciences, Engineering and Medicine. He will present a session on how big data can affect policymaking.

      Madeline Gannon working with a robot
      Madeline Gannon works with industrial robots and is working to invent better ways to communicate with machines.

      Gannon was one of 20 researchers selected to the World Economic Forum's Cultural Leaders advisory community. As part of the programming, she will be participating in sessions that discuss the impact of human-centered robotics on the future of work.

      Three Named Young Scientists

      CMU faculty members Laura Dabbish, an associate professor in the Human-Computer Interaction Institute with a joint appointment in the Heinz College of Information Systems and Public Policy; Louis-Philippe Morency, an assistant professor in the Language Technology Institute; and Tim Verstynen, an assistant professor of psychology, have been named 2017 Young Scientists by the World Economic Forum.

      Fifty-two scientists under the age of 40 are recognized this year for exhibiting exceptional creativity, thought leadership and high growth potential, and will be at the Dalian conference.

      CMU is one of only 27 universities in the world, 12 in the U.S., that make up the Global University Leaders Forum (GULF), which provides a unique platform for the world's top universities to discuss higher education and research while helping to shape the World Economic Forum agenda. GULF fosters discussion on global policy issues between member universities, the business community and a broad range of stakeholders.


                7 técnicas para redução da dimensionalidade        
      Originally published on Data Mining / Machine Learning / Data Analysis:
      In the current Big Data era, in which the cost of storage has practically been driven down to commodity levels, many corporations that boast of being Big Data 'adopters' end up paying for/storing noise instead of signal. For the reason stated above, from a Data Engineering perspective the…
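      To make the signal-versus-noise point concrete, here is a minimal sketch of one widely used dimensionality-reduction technique, PCA (presumably among the seven the post goes on to list, though the excerpt is cut off); the dataset is an illustrative stand-in:

          from sklearn.datasets import load_digits
          from sklearn.decomposition import PCA

          X, _ = load_digits(return_X_y=True)  # 64 features per sample

          # Keep only the principal components needed to explain 95% of the
          # variance; the discarded dimensions are treated as noise.
          pca = PCA(n_components=0.95)
          X_reduced = pca.fit_transform(X)

          print(f"{X.shape[1]} features reduced to {X_reduced.shape[1]}")

      The trade-off is the usual one: less storage and faster downstream models in exchange for a small, quantified loss of variance.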
                Dez principais tendências de Big Data para 2017        
      Source: Tableau Software. 2016 was a milestone year for Big Data, with more organizations storing, processing, and extracting value from data of all shapes and sizes. In 2017, systems that support large volumes of structured and unstructured data will continue to grow. There will be market demand for platforms […]
                Thousands of New Customers Improve Business and Gain Insight with Oracle Analytics Cloud        
      Press Release

      Thousands of New Customers Improve Business and Gain Insight with Oracle Analytics Cloud

      Oracle delivers industry’s most comprehensive cloud analytics platform featuring new self-learning analytics

      Redwood Shores, Calif.—Aug 10, 2017


      Connecting people to the information they need through the power of cloud technology, Oracle today announced that Oracle Analytics Cloud is experiencing significant growth with thousands of organizations subscribing to the service globally. In addition to tripling adoption over the last 12 months, nearly 75 percent of customers are new to Oracle Analytics Cloud, ranging from small and medium sized businesses accessing enterprise-class analytics for the first time to large organizations modernizing their analytics platforms. Arlington Orthopedic Associates, Outfront Media, and Skanska AB are among those using Oracle Analytics Cloud to identify new savings, help increase their return on investment, and fuel innovation.

      In response to this rapid growth, Oracle released a new version of Oracle Analytics Cloud earlier this year, extending its breadth and depth with new capabilities such as user-driven scenario modeling, next-generation mobile and social analytics, and complete customer control over their cloud environment. 

      “Oracle Analytics Cloud makes it easy for customers to gain new insights and reap the rewards of digital transformation by offering the speed, scale, power, and flexibility organizations need in a single platform,” said Rich Clayton, vice president of analytics product strategy, Oracle.  “Customers clearly understand the value, which is driving strong growth across the board – in our base, with new customers, and in utilization, which is the highest of any Oracle Platform as a Service offering.”

      Oracle Delivers Most Comprehensive Cloud Analytics Platform

      Oracle Analytics Cloud provides the industry’s most comprehensive cloud analytics in a single platform, including everything from self-service visualization and powerful inline data preparation to enterprise reporting, advanced analytics, and self-learning analytics that deliver proactive insights. With support for more than 50 data sources and an extensible, open framework, Oracle Analytics Cloud gives customers a complete, connected, collaborative platform that brings the power of data and analytics to every process, interaction, and decision. 

      John Cronin, Group CIO at An Post, explained that Ireland’s postal service chose Oracle Analytics Cloud “to extend and integrate into our existing big data analytics. This modern, agile platform has enabled us to readily externalize our existing analytics and share insights with key customers.”

      As part of this release, Oracle introduced an innovative new service, Oracle Analytics Cloud Day by Day. It is the first enterprise analytic application delivering proactive analytics to mobile devices based on business updates and personal preferences, ensuring the right information is always available, without customers even having to ask for it. Oracle Analytics Cloud Day by Day is complemented by a native mobile application, Oracle Analytics Cloud Synopsis, which enables anyone to visually analyze files on their mobile devices and then combine those insights with business information in Oracle Analytics Cloud Day by Day. The Oracle Analytics Cloud Synopsis app is available for free to all mobile users from the App Store for iPhone and iPad, and Google Play™ Store.

      “One of our goals is to help our customers take advantage of cloud analytics,” said Francisco Tisiot, principal consultant at Rittman Mead.  “Oracle Analytics Cloud provides complete and elastic business intelligence, and is customizable and manageable by customers, all in the Oracle Cloud.”


      Contact Info
      Jesse Caputo
      Oracle PR
      +1.650.506.5967
      jesse.caputo@oracle.com
      Kristin Reeves
      Blanc & Otus
      +1.415.856.5145
      kristin.reeves@blancandotus.com
      About Oracle

      The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

      Trademarks

      Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.


      Talk to a Press Contact

      Jesse Caputo

      • +1.650.506.5967

      Kristin Reeves

      • +1.415.856.5145

                Oracle Significantly Expands Cloud at Customer with PaaS and SaaS Services to Help Customers in their Journey to the Cloud        
      Press Release

      Oracle Significantly Expands Cloud at Customer with PaaS and SaaS Services to Help Customers in their Journey to the Cloud

      Delivers unrivaled enterprise-grade public cloud SaaS, PaaS, and IaaS services in customers’ datacenters

      Redwood Shores, Calif.—Jul 19, 2017


      Empowering organizations to move workloads to the cloud while keeping their data on their own premises, Oracle today announced significant expansion of the breadth of services available through Oracle Cloud at Customer. The portfolio now spans all of the major Oracle PaaS categories and for the first time, also features Oracle SaaS services. Since its introduction just over a year ago, Oracle Cloud at Customer has experienced unprecedented growth with leading global organizations across six continents and more than 30 countries adopting the solution, including AT&T and Bank of America.

      Oracle Cloud at Customer is designed to enable organizations to remove one of the biggest obstacles to cloud adoption—data privacy concerns related to where the data is stored. While organizations are eager to move their enterprise workloads to the public cloud, many have been constrained by business, legislative and regulatory requirements that have prevented them from being able to adopt the technology. These first-of-a-kind services provide organizations with choice in where their data and applications reside and a natural path to easily move business critical applications eventually to the public cloud.

      “Oracle Cloud at Customer is a direct response to the remaining barriers to cloud adoption and turning those obstacles into opportunities by letting customers choose the location of their cloud services,” said Thomas Kurian, president, product development, Oracle. “We are providing a unique service that enables our customers to leverage Oracle Cloud services, including SaaS, PaaS, and IaaS, both on their premises and in our cloud.  Customers gain all the benefits of Oracle’s robust cloud offerings, in their own datacenters, all managed and supported by Oracle.”

      Underpinning Oracle Cloud at Customer is a modern cloud infrastructure platform based on converged Oracle hardware, software-defined storage and networking and a first class IaaS abstraction. Oracle fully manages and maintains the infrastructure at customers’ premises so that customers can focus on using the IaaS, PaaS and SaaS services. This is the same cloud infrastructure platform that powers the Oracle Cloud globally.

      Based on overwhelming customer demand, Oracle continues to expand the services available via Oracle Cloud at Customer. With today’s news, customers now have access to all of Oracle’s major PaaS categories, including Database, Application Development, Analytics, Big Data, Application and Data Integration, and Identity Management. These services take advantage of specific enhancements that have been made to the underlying Oracle Cloud at Customer platform such as servers with faster CPUs and NVMe-based flash storage, as well as all-flash block storage to deliver even better performance for enterprise workloads.

      For the first time, Oracle has also made available, via Oracle Cloud at Customer, the ability to consume Oracle SaaS services such as Enterprise Resource Planning, Human Capital Management, Customer Relationship Management, and Supply Chain Management in their own datacenters. These best-in-class, modern applications help unlock business value and increase performance by enabling businesses and people to be more informed, connected, productive, and engaged. Major organizations are already adopting this new option to modernize their key enterprise operations and benefit from the speed of innovation in Oracle SaaS without having to move sensitive application data outside their premises. With the addition of SaaS services to Oracle Cloud at Customer, customers have access to Oracle Cloud services across the entire cloud stack, all delivered in a subscription-based, managed model, directly in their datacenters.

      Also, newly available is the Oracle Big Data Cloud Machine, which is an optimized system delivering a production-grade Hadoop and Spark platform with the power of dedicated nodes and the flexibility and simplicity of a cloud offering. Organizations can now access a full range of Hadoop, Spark, and analytics tools on a simple subscription model in their own data centers.

      Oracle Cloud at Customer delivers the following Oracle Cloud services:

      • Infrastructure: Provides elastic compute, containers, elastic block storage, object storage, virtual networking, and identity management to enable portability of Oracle and non-Oracle workloads into the cloud.
      • Data Management: Enables customers to use the number one database to manage data infrastructure in the cloud with the Oracle Database Cloud, including Oracle Database Exadata Cloud for extreme performance and Oracle MySQL Cloud.
      • Big Data and Analytics:  Empowers an entire organization to use a single platform to take advantage of any data to drive insights. Includes a broad set of big data cloud services, including Oracle Big Data Cloud Service, Oracle Analytics Cloud, and Oracle Event Hub Cloud.
      • Application Development: Enables organizations to develop and deploy Java applications in the cloud using Oracle Java Cloud, Oracle Application Container Cloud, Oracle Container Cloud, and Oracle WebCenter Portal Cloud.
      • Enterprise Integration: Simplifies integration of on-premises applications to cloud applications, as well as cloud application to cloud application integration using Oracle Integration Cloud, Oracle SOA Cloud, Oracle Data Integrator Cloud, Oracle GoldenGate Cloud, Oracle Managed File Transfer Cloud, and Oracle Internet of Things Cloud.
      • Security: Enables organizations to use Oracle Identity Cloud to implement and manage consistent identity and access management policies.
      • Software-as-a-Service: Provides organizations with a complete suite of software to run their businesses, including Oracle ERP Cloud, Oracle CX Cloud, Oracle HCM Cloud, and Oracle Supply Chain Management Cloud.

      Customer Demand Drives Expansion of Portfolio

      Global organizations are turning to Oracle Cloud at Customer to standardize on a platform to modernize existing infrastructure and develop innovative new applications. Customers including City of Las Vegas, Federacion Colombiana de Municipios, Glintt Healthcare, HCPA, NEC, NTT DATA, Rakuten Card, State University of New York, and State Bank of India are benefitting from Oracle Cloud services from inside their own datacenters.

      “The City of Las Vegas is shifting its Oracle application workloads to the Oracle Cloud,” said Michael Sherwood, Director Information Technologies, city of Las Vegas. “By keeping the data in our data center, we retain full control while enabling innovation, gaining efficiencies and building applications to better serve our community.”

      “Today, public organizations are constantly innovating to meet the needs of our citizens. For the Colombian Federation of Municipalities, we have decided to digitally transform our territories to become smart cities,” said Alejandro Murillo, CIO of the Colombian Federation of Municipalities. “With Oracle Cloud at Customer, we have the technological capabilities to bring top-level solutions in the cloud to our municipalities, enabling them to operate with more agility and better serve our citizens.”

      “Oracle Cloud at Customer provides us with a consolidated solution to make sensitive healthcare data securely available,” said Nuno Vasco Lopes, CEO, Glintt Healthcare Solutions. “The efficient and flexible solution has reduced the total cost of ownership by 18 percent and delivered high customer performance.” 

      Oracle Cloud at Customer

      The Oracle Cloud at Customer portfolio of services enables organizations to get all of the benefits of Oracle’s public cloud services in their datacenters. The business model is just like a public cloud subscription; the hardware and software platform is the same; Oracle experts monitor and manage the infrastructure; and the same tools used in Oracle’s public cloud are used to provision resources on the Oracle Cloud at Customer services. This is the only offering from a major public cloud vendor that delivers a stack that is 100 percent compatible with the public cloud but available on-premises, ensuring that customers get the same experience and the latest innovations and benefits using it in their datacenters as in the public cloud. 

      Additional Resources


      Contact Info
      Nicole Maloney
      Oracle
      +1.415.235.4033
      nicole.maloney@oracle.com
      Kristin Reeves
      Blanc & Otus
      +1.415.856.5145
      kristin.reeves@blancandotus.com
      About Oracle

      The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at www.oracle.com.

      Trademarks

      Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

      Safe Harbor

      The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 


      Talk to a Press Contact

      Nicole Maloney

      • +1.415.235.4033

      Kristin Reeves

      • +1.415.856.5145

                Twigify Flat UI for D8 Big Data Drupal / HyperDrupal        

      stub

      I want to use a Twigified Flat UI for a D8 HyperDrupal / BigData Drupal demo I will be doing at Badcamp

      I probably won't get it finished, but I'd like to present a work in progress, warts and all

      https://drupal.org/sandbox/forest/1974984
      http://2013.badcamp.net/sessions/big-data-drupal-cloudera-hadoop-mapredu...
      https://drupal.org/node/2104503

      I am trying to convince a couple of others to share the stage with me to present / panel / workshop

      Will you be at Badcamp? I'd also like to Twigify the Unary theme

      As I said, I doubt I'll get it done, but I'd like to try, and if it means something half-baked, well, I can live with that


                Futurology – Changing Farming Through Big Data        
      The Futurology team speaks to The Better Trading Company Managing Director, Stephen Wormald, about how they are changing rural farming through big data. They chat about how this impacts food stability and the future sustainability of farmers.
                Mu Sigma Launches “Meta-software,” a New Approach to Analytical Software to Improve Business Problem Solving Using Big Data        

      New approach resolves limitations of flexibility and scalability while accelerating analytical problem solving and software reusability across multiple business problems

      (PRWeb January 11, 2017)

      Read the full story at http://www.prweb.com/releases/2017/01/prweb13973644.htm


                Seeking a colleague for a Senior data management expert / team leader position. | Responsibilities: Coord...        
      Seeking a colleague for a Senior data management expert / team leader position. | Responsibilities: Coordinate the business development of the data management practice at PwC Hungary; • Develop new business opportunities and value-enhancing data management / big data solutions; • Contribute to client portfolio planning and cooperate with other business development initiatives; • Management of data management team staffing, training and project allocation; • Create the frameworks for value-enhancing data management and data intelligence solutions for PwC clientele and internal projects; • Oversee the data management activities and provide guidance translating business objectives into analytic procedures; • Dissemination of data management procedures and techniques, providing training materials and holding trainings for external and internal parties; • Professional review of analyses, reports and deliverables provided by junior colleagues. | What we offer: A challenging work environment in which you will face diverse and unique problems and situations; • A professional, team-oriented and dynamic workplace; • Up-to-date technologies and methodologies of our international network; • Professional development and training opportunities; • A competitive salary and benefits package (cafeteria, laptop, mobile phone, and other benefits). | Requirements: 4-5 years of experience with data analytics and/or business intelligence; • University/College degree is a must; • In-depth knowledge of data visualization solutions (Tableau, PowerBI preferred); • Experience with audit-focused data analytics tools (ACL, IDEA preferred); • Familiarity and hands-on experience with SQL; • Fluency in programming languages (preferably Python and R) would be an advantage; • Strong English language skills (additional language skills would be an advantage); • Significant experience with leading ERP solutions (SAP, Dynamics AX, Oracle); • Experience in leading projects and managing group dynamics; • A proactive approach to business development and building client relationships; • A strong desire for continuous improvement and client-facing responsibilities; • Demonstrated ability to think abstractly, solve problems and deal with ambiguity. | More information and application here: www.profession.hu/allas/1055995
                Wrap-up: Cisco at HPC for Wall Street        
      We recently returned from another great experience at the High Performance Computing Linux for Wall Street event in New York on April 7, 2014. This year’s 11th annual HPC conference focused on big data, HPC applications, data center fabrics, cloud economics, low latency and how these technologies are all changing the way global financial markets are evolving. […]
                How Entrepreneurs Are Winning By Understanding Big Data        
      Three years ago, CSC predicted that by 2020 data production will be 44 times greater than it was in 2009. Zettabytes (that’s one billion terabytes) of information, residing online and on internal databases, have become both a huge opportunity and a terrifying information overload for many companies. Both private and public [...]
                6 Ways Big Data Will Shape Online Marketing In 2015        
      Initially big data seemed to be something only available to the biggest businesses. Analytics are now being built into almost every application, however, making the technology accessible to businesses of all sizes. As businesses realize the power of information to create successful marketing campaigns and see real-time results, data is [...]
                Big data and new tech helping Illinois farmers        
      Illinois farmers have traditionally used observations about their fields to make decisions, but experts say big data and new technology
                Data Integration is the Foundation        

      Unless you live under a rock, you’ve seen the buzz about Data Lakes, Big Data, Data Mining, Cloud-tech, and Machine Learning. I watch and read reports from two perspectives: as an engineer and as a consultant.

      As a Consultant

      If you watch CNBC, you won’t hear discussions about ETL Incremental Load or Slowly Changing Dimensions Design Patterns. You will hear them using words like “cloud” and “big data,” though. That means people who watch and respect the people on CNBC are going to hire consultants who are knowledgeable about cloud technology and Big Data.

      As an Engineer

      I started working with computers in 1975. Since that time, I believe I’ve witnessed about one major paradigm shift per decade. I believe I am now witnessing two at the same time: 1) A revolution in Machine Learning and all the things it touches (which includes Big Data and Data Lakes); and 2) the Cloud. These two are combining in some very interesting ways. Data Lakes and Big Data appliances and systems are the sources for many systems, Machine Learning and Data Mining solutions are but a couple of their consumers. At the same time, much of this technology and storage is either migrating to the Cloud, or is being built there (and in some cases, only there). But all of this awesome technology depends on something…

      Data

      In order for Machine Learning or Data Mining to work, there has to be data in the Data Lake or in the Big Data appliance or system. Without data, the Data Lake is dry. Without data, there’s no “Big” in Big Data. How do these solutions acquire data?

      It Depends

      Some of these new systems have access to data locally. But many of them – most, if I may be so bold – require data to be rounded up from myriad sources. Hence my claim that data integration is the foundation for these new solutions.

      What is Data Integration and Why is it Important?

      Data integration is the collection of data from myriad, disparate sources into a single repository (or a minimal number of repositories). It’s “shipping” the data from where it is to someplace “nearer.” Why is this important? Internet connection speeds are awesome these days. I have – literally – 20,000 times more bandwidth than when I first connected to the internet. But modern internet connection speeds are hundreds to millions of times slower than networks running inside data centers. Computing power – measured in cycles or flops per second – is certainly required to perform today’s magic with Machine Learning. But if the servers must wait hours (or longer) for data – instead of milliseconds? The magic happens in slow-motion. In slow-motion, magic doesn’t look awesome at all.

      Trust me, speed matters.
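
      To make “shipping the data to someplace nearer” concrete, here is a minimal sketch of a watermark-based incremental load – the ETL pattern mentioned above – in Python, with in-memory SQLite standing in for both systems. The table, the watermark, and the load logic are hypothetical simplifications; a production SSIS or cloud pipeline would also handle updates, deletes, retries, and schema drift.

          import sqlite3

          # Hypothetical source and destination; a real solution would point
          # these at the operational system and the data lake / warehouse.
          source = sqlite3.connect(":memory:")
          dest = sqlite3.connect(":memory:")

          source.execute("CREATE TABLE orders (id INTEGER, amount REAL, modified TEXT)")
          source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                             [(1, 10.0, "2017-05-01"), (2, 25.0, "2017-05-02")])

          dest.execute("CREATE TABLE orders (id INTEGER, amount REAL, modified TEXT)")
          dest.execute("CREATE TABLE watermark (last_modified TEXT)")
          dest.execute("INSERT INTO watermark VALUES ('1900-01-01')")

          def incremental_load():
              """Ship only rows changed since the last load, not the whole table."""
              last = dest.execute("SELECT last_modified FROM watermark").fetchone()[0]
              changed = source.execute(
                  "SELECT id, amount, modified FROM orders WHERE modified > ?",
                  (last,)).fetchall()
              if changed:
                  dest.executemany("INSERT INTO orders VALUES (?, ?, ?)", changed)
                  dest.execute("UPDATE watermark SET last_modified = ?",
                               (max(row[2] for row in changed),))
                  dest.commit()
              return len(changed)

          print(incremental_load())  # 2 rows shipped on the first run
          print(incremental_load())  # 0 until new rows arrive at the source

      The watermark is the speed argument in miniature: each run ships only the delta across the slow link, so downstream consumers aren’t waiting hours for a full reload.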

      Data integration is the foundation on which most of these systems depend. Some important questions to consider:

      • Are you getting the most out of your enterprise data integration?
      • Could your enterprise benefit from faster access to data – perhaps even near real-time business intelligence?
      • How can you improve your enterprise data integration solutions?

      :{>

      Learn more:

      Enterprise Data & Analytics
      Stairway to Integration Services
      IESSIS1: Immersion Event on Learning SQL Server Integration Services
      EnterpriseDNA Training


                SAP Ariba and MercadoLibre to consumerize business commerce in Latin America         
      The next BriefingsDirect global digital business panel discussion explores how the expansion of automated tactical buying for business commerce is impacting global markets, and what's in store next for Latin America.

      We’ll specifically examine how “spot buy” approaches enable companies to make time-sensitive and often mission-critical purchases, even in complex and dynamic settings, like Latin America.

      Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

      To learn more about the rising tide of such tactical business buying improvements, please join our guests, Karen Bruck, Corporate Sales Director at MercadoLibre.com in Buenos Aires, Argentina; Diego Cabrera Canay, Director of Financial Planning at MercadoLibre; and Tony Alvarez, General Manager of SAP Ariba's Spot Buy Business. The panel was recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas, and is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

      Here are some excerpts:

      Gardner: SAP Ariba Spot Buy has been in the market a few years. Tell us about where it has rolled out so far, why certain markets are being approached, and then about Latin America specifically.

      Alvarez: The concept is a few years old, but we've been delivering SAP Ariba Spot Buy for about a year. We began in the US, and over the past 12 months the concept of Spot Buy has progressed because our customer base has pushed us in a direction that is, quite frankly, even beyond Spot Buy -- into trusted, vetted content.

      We are approaching the market with a two-pronged strategy of, yes, we have the breadth of content so that when somebody goes into an SAP Ariba application they can find what they are looking for, but we also now have parameters and controls that allow them to vet that content and to put a filter on it.

      Over the last 12 months, we've come a long way. We are live in the US, and with early access in the UK and Germany. We just went live in Australia, and now we are very much looking forward to going live and moving fast into Latin America with MercadoLibre.

      Gardner: Spot buying, or tactical buying, is different from strategic or more organized long-term buying. Tell us about this subset of procurement.

      Alvarez: SAP Ariba is a 20-year-old company, and its roots are in that rigorous, sourced approach. We do hundreds of billions of dollars through contract catalog on the Ariba Network, but there's a segment -- and we believe it's upward of 15% of spend -- that is spot buy spend. The procurement professional often has no idea what's being bought. And I think there are two reactions to that -- either ignorance is bliss and they are glad that it’s out of their purview, or it keeps them up at night.

      SAP Ariba Spot Buy allows them to have visibility into that spend. By partnering with providers like MercadoLibre, they have content from trusted and vetted sellers to bring to the table – so it's a really nice match for procurement.

      Liberating limits

      Gardner: The trick is to allow for flexibility and being dynamic, but also putting in enough rules and policies so that things don’t go off-track.

      Alvarez: Exactly. For example, it’s like putting a filter on your kids’ smartphone. You want them to be able to be liberated so they can go and do as they please with phone calls -- but not to go off the guardrails.

      Gardner: Karen, tell us about MercadoLibre and why Latin America might be a really interesting market for this type of Spot Buy service.

      Bruck: MercadoLibre is a leading e-commerce platform in Latin America, where we provide the largest marketplaces in 16 different countries. Our main markets are Brazil, Mexico, and Argentina, and that’s where we are going to start this partnership with SAP Ariba.

      We have upward of 60 million items listed on our platform, and this breadth of supply will make purchasing very exciting. Latin America is a complicated market -- and we like this complexity; we do very well in it.

      It’s complicated because there are different rates of inflation in different countries, and so contracts can be hard to complete. What we bring to the table is an assortment of great payment and shipping solutions that make it easy for companies to purchase items. As Tony was saying, these are not under long-term contracts, but we still get to make use of this vast supply.

      Gardner: Tony mentioned that maybe 15% of spend is in this category. Diego, do you think that that number might be higher in some of the markets that you serve?

      Cabrera Canay: That’s probably the number -- but that is a big number in terms of the spend within companies. So we have to get there and see what happens.

      Progressive partnership 

      Gardner: Tony, tell us about the partnership. What is MercadoLibre.com bringing to the table? What is Ariba bringing to the table? How does this fit together for a whole that is greater than the sum of its parts?

      Alvarez: It really is a well-matched partnership. SAP Ariba is the leading cloud procurement platform, period. When you look in Latin America, our penetration with SAP Enterprise Resource Planning (ERP) is even greater. We have a very strong installed base with SAP ERP.

      Our plan is to take the SAP Ariba Spot Buy content and make it available to the SAP installed base. So this goes way beyond just SAP Ariba. And when you think about what Karen mentioned -- difficulties in Latin America with high inflation -- the catalog approach is not used as much in Latin America because everything is so dynamic.

      For example, you might sign a contract, but in just a couple of weeks that contract may be obsolete, or unfavorable because of a change in pricing. But once we build controls and parameters in SAP Ariba Spot Buy, you can layer that on top of MercadoLibre content, which is super-broad. If you're looking for it you’re going to find it, and that content is constantly updated. You gain real-time access to the latest information, and then the procurement person gets the benefit of control.

      So I'm very optimistic. As Diego mentioned, I think 15% is really on the low-end in Latin America for this type of spend. I think this will be a really nice way to put digital catalog buying in the hands of large enterprise buyers.

      Gardner: Speaking of large enterprise buyers, if I'm a purchasing official in one of your new markets, what should I be thinking about how this is going to benefit me?

      Transparent, trusted transactions

      Bruck: Let me talk about this from experience. As a country manager at MercadoLibre, I had to do a lot of the procurement, together with our procurement officers. It was really frustrating at times because all of these purchases had to be one-off engagements, with a different vendor every time. That takes a lot of time. You also have to bring in price comparisons, and that’s not always a simple process.

      So what this platform gives you is the ability to be very transparent about prices among different suppliers. That makes it very easy to buy every time without having to call each vendor and get them onto your own buying platform.

      It saves a lot of time, it makes the comparison very transparent, and you are able to control the different options. Overall, it’s a win-win. So I do believe this is a partnership, a match made in heaven.

      We were also very interested in business-to-business (B2B) industries. When Tony and SAP Ariba came to our offices to offer this partnership, we thought this would be a great way to leverage their needs with our supply and make it work.

      Gardner: For sellers, this enables them to do repeated business more easily, more automated and so at scale. For buyers, with transparency they have more insight into getting the best prices, the best terms of delivery. Let's expand on that win-win. Diego, tell us about the business benefits for all parties.

      Big and small, meet at the mall 

      Cabrera Canay: In the past few years, we have been working to make MercadoLibre the biggest “mall” in e-commerce. We have the most important brands and the most important retailers selling through MercadoLibre.

      What differentiates us is that we are confident we have the best prices -- and also other great services such as free shipping, easy payments, and financing. We are sure that we can offer the buyers better purchasing.

      Obviously, from the side of sellers, this all provides higher demand, it raises the bar in terms of having qualified buyers, and then giving the best services. That’s very exciting for us.

      Gardner: Tony, we mentioned large enterprises, but this cuts across a great deal more of the economy, such as small- to medium sized (SMB) businesses. Tell us about how this works across diverse economies where there are large players but lots of small ones, too?

      Alvarez: On the sales side, this gives really small businesses opportunity to reach large enterprise buyers that probably weren’t there before.

      Diego was being modest, but MercadoLibre's payment structure, MercadoPago, is incredibly robust, and it's incredibly valuable to that end-seller, and also to the buyer.

      Just having that platform and then connecting -- you are basically taking two populations, the large and small sellers, and the large and small buyers, and allowing them to commingle more than they ever had in the past.

      Gardner: Karen, as you mentioned from your own experience, when you're dealing with paper, and you are dealing with one-offs, it's hard to just keep track of the process, never mind to analyze it. But when we go digital, when we have a platform, when we have business networks at work, then we can start to analyze things for companies -- and more broadly into markets.

      How do you see this partnership accelerating the ability to leverage analytics, leverage some of the back-end platform technologies with SAP HANA and SAP Ariba, and making more strides toward productivity for your customers?

      Data discoveries

      Bruck: Right. When everything is tracked, as this will be, because every single purchase will be inside their SAP Ariba platform, it is all part of your “big data.” So then you can actually drop it, control it, analyze it, and say, “Hey, maybe these particular purchases mean that we should have long-term contracts, or that our long-term contracts were not priced correctly,” and maybe that's an opportunity to save money and lower costs.

      So once you can track data, you can do a lot of things, and discover new opportunities for either being more efficient or reducing costs – and that's ultimately what we all want in all the departments of our companies.
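
      As a toy illustration of the kind of analysis Bruck describes -- not SAP Ariba's actual analytics -- here is a minimal Python sketch over hypothetical tracked purchases that flags items worth a long-term contract, and contracts that look mispriced:

          from collections import defaultdict

          # Hypothetical tracked purchases: (item, unit_price, source); illustrative only.
          purchases = [
              ("toner", 30.0, "spot"), ("toner", 28.0, "spot"), ("toner", 29.0, "spot"),
              ("paper", 4.0, "contract"), ("paper", 3.5, "spot"),
          ]

          by_item = defaultdict(lambda: {"spot": [], "contract": []})
          for item, price, source in purchases:
              by_item[item][source].append(price)

          for item, p in by_item.items():
              # Frequent spot buys of the same item hint that a contract may pay off.
              if len(p["spot"]) >= 3 and not p["contract"]:
                  print(f"{item}: {len(p['spot'])} spot buys, consider a long-term contract")
              # A lower spot price hints that the existing contract is mispriced.
              if p["contract"] and p["spot"] and min(p["spot"]) < min(p["contract"]):
                  print(f"{item}: spot price {min(p['spot'])} beats contract {min(p['contract'])}")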

      Gardner: And for those listeners and readers who are interested in taking advantage of these services, and ultimately that great ability to analyze, what should they be doing now to get ready? Are there some things they could do culturally, organizationally, in order to become that more digital business when these services are available to them?

      Cabrera Canay: I can talk about our own case, where we are rebuilding our purchase processes. Paper is terrible for companies; you have to rethink your purchase processing in a digital way. Once you do, SAP Ariba is a great solution, and with SAP Ariba Spot Buy we will have the best conditions for the buyers.

      Bruck: It’s a natural process. People are going digital and embracing these new trends and technologies. It will make them more efficient. If they get up to speed quickly, it will become less about controlling stuff that they don't need to control. They will really understand the benefits, so it will be a natural adoption.

      Gardner: Tony, coming back full circle, as you have rolled SAP Ariba Spot Buy out from North America to Europe to Asia-Pacific, and now to Latin America -- what have you learned in the way people use it?

      Alvarez: First, at a macro level, people have found this to be a useful tool to replace some of the contracts that were less important, and so they can rely on marketplaces.

      Second, we’ve really found as we’ve deployed in the US that a lot of times multinational companies are like, “Hey, that's great, I love this, but I really want to use this in Latin America.” So they want to go and get visibility elsewhere.

      Turn-key technique

      Third, they want a tool that doesn't require any training. If I’m a procurement professional, I want my users to already be expert at using the tool. We've designed this in the process context, and in concert with the content partners. You can just walk up and start using it. You don’t have to be an expert, and it keeps you within the guardrails without even thinking about it.

      Gardner: And being a cloud-based, software-as-a-service (SaaS) solution you're always analyzing how it's being used -- going after that ultimate optimized user experience -- and then building those improvements back in on a constant basis?

      Alvarez: Exactly. Always.

      Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

      You may also be interested in:


                Experts define new ways to manage supply chain risk in a digital economy        
      The next BriefingsDirect digital business thought leadership panel discussion explores new ways that companies can gain improved visibility, analytics, and predictive responses to better manage supply chain risk in the digital economy.

      The panel examines how companies such as Nielsen are using cognitive computing search engines, and even machine learning and artificial intelligence (AI), to reduce risk in their overall buying and acquisitions.

      Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

      To learn more about the exploding sophistication around gaining insights into advanced business commerce, we welcome James Edward Johnson, Director of Supply Chain Risk Management and Analysis at Nielsen; Dan Adamson, Founder and CEO of OutsideIQ in Toronto; and Padmini Ranganathan, Vice President of Products and Innovation at SAP Ariba.

      The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

      Here are some excerpts:

      Gardner: Padmini, we heard at SAP Ariba LIVE that risk is opportunity. That stuck with me. Are the technologies really now sufficient that we can fully examine risks to such a degree that we can turn that into a significant business competitive advantage? That is to say, those who take on risk seriously, can they really have a big jump over their competitors?

      Ranganathan: I come from Silicon Valley, where startups have to take risks to grow into big businesses, and we have seen a lot of successful entrepreneurs do that. Clearly, taking risks drives bigger opportunity.

      But in this world of supplier and supply chain risk management, it’s even more important and imperative that the buyer and supplier relationships are risk-aware and risk-free. The more transparent that relationship becomes, the more opportunity for driving more business between those relationships.

      That context of growing business -- as well as growing the trust and the transparent relationships -- in a supply chain is better managed by understanding the supplier base. Understanding the risks in the supplier base, and then converting them into opportunities, makes it possible to mitigate and solve problems jointly. By collaborating, buyers and suppliers form partnerships.

      Gardner: Dan, it seems that what was once acceptable risk can now be significantly reduced. How do people in procurement and supply chain management know what acceptable risk is -- or maybe they shouldn’t accept any risk?

      Adamson: My roots are also in Silicon Valley, and I think you are absolutely right that at times you should be taking risks -- but not unnecessarily. What the procurement side has struggled with -- and I saw this moving from financial institutions, which treat risk very differently, into procurement -- is weighing risk against the price point of avoiding that risk. That’s traditionally been the big problem.

      If, for every vendor that you on-board, you have to pay $1,000 for a due-diligence report, it's really not cost-effective. But if you can maintain and monitor that vendor on a regular basis at an acceptable cost, then there's a real risk-versus-reward benefit in there.

      What we are helping to drive are a new set of technology solutions that enable a deeper level of due diligence through technology, through cognitive computing, that wasn't previously possible at the price point that makes it cost-effective. Now it is possible to clamp down and avoid risk where necessary.

      Gardner: James, as a consumer of some of these technologies, do you really feel that there has been a significant change in that value equation, that for less money output you are getting a lot less risk?

      Knowing what you're up against  

      Johnson: To some degree that value was always there; it was just difficult to help people see that value. Obviously tools like this will help us see that value more readily.

      It used to be that in order to show the value, you actually had to do a lot of work, and it was challenging. What we are talking about here is that we can begin to boil the ocean. You can test these products, and you can do a lot of work just looking at test results.

      And, it's a lot easier to see the value because you will unearth things that you couldn't have seen in the past.

      Whereas it used to take a full-blown implementation to begin to grasp those risks, you can now just test your data and see what you find. Most people, once they have their eyes wide open, will be at least a little more fearful. But, at the same time -- and this goes back to the opportunity question you asked -- they will see the opportunity to actually tackle these risks. It’s not like those risks didn't exist in the past, but now they know they are there -- and they can decide to do something about it, or not.

      Gardner: So rather than avoid the entire process, now you can go at the process but with more granular tools to assess your risks and then manage them properly?

      Johnson: That's right. I wouldn't say that we should have a risk-free environment; that would cost more money than we’re willing to pay. That said, we should be more conscious of what we're not yet willing to pay for.

      Rather than just leaving the risk out there and avoiding business where you can’t access information about what you don't know -- now you'll know something. It's your choice to decide whether or not you want to go down the route of eliminating that risk, of living with that risk, or maybe something in between. That's where the sweet spot is. There are probably a lot of intermediate actions that people would be taking now that are very cheap, but they haven't even thought to do so, because they haven’t assessed where the risk is.

      Gardner: Padmini, because we're looking at a complex landscape -- a supply chain, a global supply chain, with many tiers -- when we have a risk solution, it seems that it's a team sport. It requires an ecosystem approach. What has SAP Ariba done, and what is the news at SAP Ariba LIVE? Why is it important to be a team player when it comes to a fuller risk reduction opportunity?

      Teamwork

      Ranganathan: You said it right. The risk domain world is large, and it is specialized. The language that the compliance people use in the risk world is somewhat similar to the language that the lawyers use, but very different from the language that the information technology (IT) security and information security risk teams use.

      The reason you can’t see many of the risks is partly because the data and the information have been too fragmented, too broad, too wide. It’s also because the types of risks, and the people who deal with these risks, are scattered across the organization.

      So a platform that supports bringing all of this together is number one. Second, the platform must support the end-to-end process of managing those supply chain relationships, managing the full supply chain, and gaining transparency across it. That’s where SAP Ariba has headed with Direct Materials Sourcing and with getting more into supply chain collaboration. That’s what you heard at SAP Ariba LIVE.

      We all understand that supply chain much better when we are in SAP Ariba, and then you have this ecosystem of partners and providers. You have the technology with SAP HANA to mash up big data, set it in context, and understand the patterns. We also have the open ecosystem and the open-source platform to allow us to take that even wider. And last but not least, there is the business network.

      So it’s not just between one company and another company, it's a network of companies operating together. The momentum of that collaboration allows users to say, “Okay, I am going to push for finding ethical companies to do business with,” -- and then that's really where the power of the network multiplies.

      Gardner: Dan, when a company nowadays buys something in a global supply chain, they are not just buying a product -- they are buying everything that's gone on with that product, such as the legacy of that product, from cradle to PO. What is it that OutsideIQ brings to the table that helps them get a better handle on what that legacy really is?

      Dig deep, reduce risk, save time

      Adamson: Yes, and they are not just buying from that seller, they are buying from the seller that sold it to that seller, and so they are buying a lot of history there -- and there is a lot of potential risk behind the scenes.

      That’s why this previously has been a manual process, because there has been a lot of contextual work in pulling out those needles from the haystack. It required a human level of digging into context to get to those needles.

      The exciting thing that we bring is a cognitive computing platform that’s trainable -- and it's been trained by financial-crime experts and corporate compliance experts. Increasingly, supply management experts help us know what to look for. The platform has the capability to learn about its subject, so it can go deeper. It can actually pivot on where it's searching. If it finds a presence in Afghanistan, for example, well then that's a potential risk in itself, but it can then go dig deeper on that.

      And that level of deeper digging is something that a human really had to do before. This is the exciting revolution that's occurring. Now we can bring back that data, it can be unstructured, it can be structured, yet we can piece it together and provide some structure that is then returned to SAP Ariba.

      The great thing about the supply management risk platform or toolkit that's being launched at SAP Ariba LIVE is that there’s another level of context on top of that. Ariba understands the relationship between the supplier and the buyer, and that's an important context to apply as well.

      How you determine risk scores on top of all of that is very critical. You need to weed out all of the noise, otherwise it would be a huge data science exercise and everyone would be spinning his or her wheels.
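
      As a toy picture of that layering -- the signal names, weights, and threshold below are invented, since the real OutsideIQ and SAP Ariba scoring models are proprietary -- a context-weighted risk score might look like this:

          # Invented risk signals for one supplier; not the real scoring model.
          signals = {"watchlist_hit": 0.9, "adverse_media": 0.4, "name_collision": 0.1}
          weights = {"watchlist_hit": 1.0, "adverse_media": 0.6, "name_collision": 0.05}

          def risk_score(signals, weights, context_multiplier=1.0):
              """Weighted sum of signals, scaled by the buyer-supplier context
              (e.g. a supplier handling confidential data scores higher)."""
              base = sum(weights[name] * value for name, value in signals.items())
              return base * context_multiplier

          score = risk_score(signals, weights, context_multiplier=1.5)
          # Only scores above a noise threshold reach a human analyst.
          if score > 0.5:
              print(f"flag for analyst review, score={score:.2f}")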

      This is now a huge opportunity for clients like James to truly get some low-hanging fruit value, where previously it would have been literally a witch-hunt or a huge mining expedition. We are now able to achieve this higher level of value.

      Gardner: James, Dan just described what others are calling investigative cognitive computing brought to bear on this supply chain risk problem. As someone who is in the business of trying to get the best tools for their organization, where do you come down on this? How important is this to you?

      Johnson: It's very important. I have done the kinds of investigations that he is talking about. For example, if I am looking at a vendor in a high-risk country, particularly a small vendor that doesn't have an international presence -- that is problematic for most supplier investigations. What do I do? I will go and do some of the investigation that Dan is talking about.

      Now I'm usually sitting at my desk in Chicago. I'm not going out in the world. So there is a heightened level of due diligence that I suspect neither of us is really talking about here. With that limitation, you want to look up not only the people, but all of their connections. You might have had a due-diligence form completed, but that's an interested party giving you information -- what do you do with it?

      Well, I can run the risk search on more than just the entity that I'm transacting with. I am going to run it on everyone that Dan mentioned. Then I am going to look up all their LinkedIn profiles, see who they are connected to. Do any of those people show any red flags? I’d look at the bank that they use. Are there any red flags with their bank?

      I can do all that work, and I can spend several hours doing all that work. As a lawyer I might dig a little deeper than someone else, but in the end, it's human labor going into the effort.

      Gardner: And that really doesn't scale very well.

      Johnson: That does not scale at all. I am not going to hire a team of lawyers for every supplier. The reality here is that now I can do some level of that time-consuming work with every supplier by using the kind of technology that Dan is talking about.

      The promise of OutsideIQ technology is incredible. It is an early, and quickly expanding, opportunity. It's because of relationships like the one between SAP Ariba and OutsideIQ that I see a huge opportunity between Nielsen and SAP Ariba. We are both on the same roadmap.

      Nielsen has a lot of work to do, SAP Ariba has a lot of work to do, and that work will never end, and that’s okay. We just need to be comfortable with it, and work together to build a better world.

      Gardner: Tell us about Nielsen. Then secondarily, what part of your procurement, your supply chain, do you think this will impact best first?

      Automatic, systematic risk management

      Johnson: Nielsen is a market research company. We answer two questions: what do people watch? And what do people buy? It sounds very simple, but when you cover 90% of the world’s population, which we do – more than six billion people -- you can imagine that it gets a little bit more complicated.

      We house about 54 petabytes of database data. So the scale there is huge. We have 43,000 employees. It’s not a small company. You might know Nielsen for the set-top boxes in the US that tell you what the overnight ratings were for the Super Bowl, for example, but it’s a lot more than that. And you can imagine, especially when you're trying to answer what people buy in developing countries with emerging economies, that you are touching some riskier things.

      In terms of what this SAP Ariba collaboration can solve for us, the first quick hit is that we will no longer have to leverage multiple separate sources of information. I can now leverage all the sources of information at one time through one interface. It is already being used to deliver information to people who are involved in the procurement chain. That's the huge quick win.

      The secondary win is from the efficiency that we get in doing that first layer of risk management. Now we can start to address that middle tier that I mentioned. We can respond to certain kinds of risk that, today, we are doing ad-hoc, but not systematically. There is that systematic change that will allow us to not only target the 100 to 200 vendors that we might prioritize -- but the thousands of vendors that are somewhere in our system, too.

      That's going to revolutionize things, especially once you fold in the environmental, social and governance (ESG) work that, today, is very focused for us. If I can spread that out to the whole supply chain, that's revolutionary. There are a lot of low-cost things that you can do if you just have the information.

      So it’s not always a question of, “am I going to do good in the world and how much is it going to cost me?” It’s really a question of, “What is the good in the world that’s freely available to me, that I'm not even touching?” That's amazing! And, that's the kind of thing that you can go to work for, and be happy about your work, and not just do what you need to do to get a paycheck.

      Gardner: It’s not just avoiding the bad things; it’s the false positives that you want to remove so that you can get the full benefit of a diverse, rich supplier network to choose from.

      Johnson: Right, and today we are essentially wasting a lot of time on suspected positives that turn out to be false. We waste time on them because we go deeper with a human than we need to. Let’s let the machines go as deep as they can, and then let the humans come in to take over where we make a difference.

      Gardner: Padmini, it’s interesting to me that he is now talking about making this methodological approach standardized, part of due-diligence that's not ad-hoc, it’s not exception management. As companies make this a standard part of their supply chain evaluations, how can we make this even richer and easier to use?

      Ranganathan: The first step was the data. It’s the plumbing; we have to get that right. It’s about the way you look at your master data, which is suppliers; the way you look at what you are buying, which is categories of spend; and where you are buying from, which is all the regions. So you already have the metrics segmentation of that master data, and everything else that you can do with SAP Ariba.

      The next step is then the process, because it’s really not a one-size-fits-all. It cannot be a one-size-fits-all, where every supplier that you on-board you are going to ask them the same set of questions, check the box and move on.

      I am going to use the print service vendor example again, which is my favorite. For marketing materials printing, you have a certain level of risk, and that's all you need to look at. But you still want, of course, to look at them for any adverse media incidents, or whether they suddenly got on a watch-list for something; you do want to know that.

      But when one of your business units begins to use them for customer-confidential data and statement printing -- the level of risk shoots up. So the intensity of risk assessments and the risk audits and things that you would do with that vendor for that level of risk then has to be engineered and geared to that type of risk.

      So it cannot be one-size-fits-all; it has to go past the standard. The standardization is not in the process; it is in the way you look at risk, so that you can determine how much of the process you need to apply and stay in tune.
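
      One way to read "the standardization is in the way you look at risk, not in the process" is as a tiered policy: the risk of the engagement, not a fixed checklist, decides which assessments run. Here is a minimal sketch, with invented tiers and assessment names:

          # Invented tiers and assessment names, purely to illustrate that
          # assessment intensity scales with the risk of the engagement.
          ASSESSMENTS_BY_TIER = {
              "low":    ["adverse_media_scan"],
              "medium": ["adverse_media_scan", "watchlist_check"],
              "high":   ["adverse_media_scan", "watchlist_check",
                         "onsite_audit", "data_security_review"],
          }

          def required_assessments(handles_confidential_data, annual_spend):
              """Map engagement attributes to a risk tier, then to assessments."""
              if handles_confidential_data:
                  tier = "high"
              elif annual_spend > 1000000:
                  tier = "medium"
              else:
                  tier = "low"
              return ASSESSMENTS_BY_TIER[tier]

          # The print vendor: low risk while printing marketing materials...
          print(required_assessments(False, 50000))
          # ...but high risk once it prints customer-confidential statements.
          print(required_assessments(True, 50000))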

      Gardner: Dan, clearly SAP Ariba and Nielsen, they want the “dials,” they want to be able to tune this in. What’s coming next, what should we expect in terms of what you can bring to the table, and other partners like yourselves, in bringing the rich, customizable inference and understanding benefits that these other organizations want?

      Constructing cognitive computing by layer

      Adamson: We are definitely in early days on the one hand. But on the other hand, we have historically seen many AI failures, where we failed to commercialize AI technologies. This time it's a little different, because of the big data movement and because of the well-known use cases in machine learning that have been very successful: pattern matching, recommending, and classifying. We are using those as a backbone on which to build layers of cognitive computing.

      And I think, as Padmini said, we are providing a first layer, and it’s getting stronger and stronger. We can weed out up to 95% of the false positives to start with, and really let the humans look at the thorny or potentially thorny issues that are left over. That’s a huge return on investment (ROI) and a timesaver by itself.

      But on top of that, you can add in another layer of cognitive computing, and that might be at the workflow layer that recognizes that data and says, “Jeez, just a second here, there's a potential confidentiality issue here, let's treat this vendor differently and let's go as far as plugging a special clause into the contract.” This is, I think, where SAP Ariba is going with that. It’s building a layer of cognitive computing on top of another layer of cognitive computing.

      Actually, human processes work like that, too. There is a lot of fundamental pattern recognition at the basis of our cognitive thought, and on top of that we layer logic. So it’s a fun time to be in this field, executing one layer at a time, and it's an exciting approach.

      Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

      You may also be interested in:


                How AI, IoT and blockchain will shake up procurement and supply chains         
      The next BriefingsDirect digital business thought leadership panel discussion focuses on how artificial intelligence (AI), the Internet of Things (IoT), machine learning (ML), and blockchain will shake up procurement and supply chain optimization.

      Stay with us now as we develop a new vision for how today's cutting-edge technologies will usher in tomorrow's most powerful business tools and processes. The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

      Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.
      To learn more about the data-driven, predictive analytics, and augmented intelligence approach to supply chain management and procurement, please welcome the executives from SAP Ariba: Dinesh Shahane, Sanjay Almeida, and Sudhir Bhojwani.
      Here are some excerpts:

      Gardner: It seems like only yesterday we were content to have a single view of a customer, or clean data, or maybe a single end-to-end business process. But now, we are poised to leapfrog the status quo by using words like predictive and proactive for many business functions.

      Why are AI and ML such disrupters to how we've been doing business processes?

      Shahane: If you look back, some of the technology that has had an impact in our private lives is now impacting our business lives. Think about the amount of data and signals that we are gathering; we call it big data.

      We not only do transactions in our personal lives, we also have a lot of content pushed at us. Our phones record our location as we move, so we are wired and we are hyper-connected.

      Similar things are happening to businesses. Since we are so connected, a lot of data is created. Having all that big data -- and it could be a problem from the privacy perspective -- gives you an opportunity to harness that data, to optimize it, and to make your processes much more efficient and much more engaging.

      If you think about dealing with big data, you try and find patterns in that data, instead of looking at just the raw data. Finding those patterns collectively as a discipline is called machine learning. There are various techniques, and you can find a regression pattern, or you can find a recommendation pattern -- you can find all kinds of patterns that will optimize things, and make your experience a lot more engaging.
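
      As a toy example of "finding a regression pattern," here is a pure-Python least-squares fit over made-up monthly spend figures; a real system would use an ML library and far richer features:

          # Made-up monthly spend; the "pattern" is the trend the regression recovers.
          months = [1, 2, 3, 4, 5, 6]
          spend = [100, 110, 125, 130, 145, 155]

          n = len(months)
          mean_x = sum(months) / n
          mean_y = sum(spend) / n
          # Ordinary least squares for a single feature: slope and intercept.
          slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, spend))
                   / sum((x - mean_x) ** 2 for x in months))
          intercept = mean_y - slope * mean_x

          print(f"trend: ~{slope:.1f} per month; forecast for month 7: {intercept + slope * 7:.0f}")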

      If you combine all these machine learning techniques with tools such as natural language processing (NLP), higher-level tools such as inference engines, and text-to-speech processing -- you get things like Siri and Alexa. They were created for the consumer space, but the same thing could be available for your business, and you can train it for your business processes. Overall, these improve efficiency, give delight, and provide a very engaging user experience.

      Gardner: Sanjay, from the network perspective it seems like we are able to take advantage of really advanced cloud services, put that into a user experience that could be conversational, like we do with our personal consumer devices.

      What is it about the cloud services in the network, however, that are game-changers when it comes to applying AI and ML to just good old business processes?

      Multiple intelligence recommended

      Almeida: Building on Dinesh’s comment, we have a lot of intelligent devices in our homes. When we watch Netflix, there are a lot of recommendations that happen. We control devices through voice. When we get home the lights are on. There is a lot of intelligence built into our personal lives. And when we go to work, especially in an enterprise, the experience is far different. How do we make sure that your experience at home carries forward to when you are at work?

      From the enterprise and business networks perspective, we have a lot of data; a lot of business data about the purchases, the behaviors, the commodities. We can use that data to make the business processes a lot more efficient, using some of the models that Dinesh talked about.

      How do we actually do a recommendation so that we move away from traditional search, and take action on rows and columns, and drive that through a voice interface? How do we bring that intelligence together, and recommend the next actions or the next business process? How do we use the data that we have and make it a more recommended-based interaction versus the traditional forms-based interaction?

      Gardner: Sudhir, when we go out to the marketplace with these technologies, and people begin to use them for making better decisions, what will that bring to procurement and supply chain activities? Are we really talking about letting the machines make the decisions? Where does the best of what machines do and the best of what people do meet?

      Bhojwani: Quite often I get this question: What will be the role of procurement in 2025? Are the machines going to be able to make all the decisions, leaving us no role to play? You can say the same thing about all aspects of life, so why only procurement?

      I think human intelligence is still here to stay. I believe, personally, it can be augmented. Let's take a concrete example to see what that means. At SAP Ariba, we are working on a product called product sourcing. Essentially this product takes a bill of materials (BOM), and it tells you the impact. So what is so cool about it?

      One of our customers has a BOM that is an eight-level-deep tree with 10 million nodes in it. In this 10-million-node commodity tree, or BOM, a person is responsible for managing all the items. But how does he or she know the impact of a delay on the entire tree? How do you visualize that?

      I think humans are very poor at visualizing a 10-million node tree; machines are really good at it. Well, where the human is still going to be required is that eventually you have to make a decision. Are we comfortable that the machine alone makes a decision? Only time will tell. I continue to think that this kind of augmented intelligence is what we are looking for, not some machine making complete decisions on our behalf.
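
      To make that concrete: propagating a delay up a bill of materials is a simple tree traversal, which is exactly the kind of brute-force work machines do well at ten-million-node scale. Here is a minimal sketch with a hypothetical three-level BOM:

          # Hypothetical BOM: each assembly lists the parts it depends on.
          # A real case would be eight levels deep with millions of nodes.
          bom = {
              "laptop": ["motherboard", "display"],
              "motherboard": ["cpu", "memory"],
              "display": [], "cpu": [], "memory": [],
          }
          lead_time_days = {"laptop": 2, "motherboard": 3, "display": 5,
                            "cpu": 10, "memory": 4}

          def completion_time(item, delays):
              """An item finishes after its slowest component, plus its own
              lead time and any delay on the item itself."""
              wait = max((completion_time(c, delays) for c in bom[item]), default=0)
              return wait + lead_time_days[item] + delays.get(item, 0)

          baseline = completion_time("laptop", {})
          delayed = completion_time("laptop", {"cpu": 7})  # supplier slips 7 days
          print(f"baseline {baseline} days; the cpu delay pushes it to {delayed} days")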

      Gardner: Dinesh, in order to make this more than what we get in our personal consumer space -- which in some cases is nice to have but doesn't really change the game -- we are looking for higher productivity in business. The C-suite is looking for increased margins; they are looking for big efficiencies. What is it from a business point of view that these technologies can bring? Is this going to be just lipstick on a pig, so to speak, or do we really get to change how business productivity comes about?

      Humans and machines working together

      Shahane: I truly believe it will change productivity. The whole intelligence advantage -- if you look at it from the highest perspective, such as enhanced user experience -- provides an ability to help you make your decisions.

      When you make decisions with this augmented assistant helping you along the way -- and at the same time dealing with large amounts of data combined into a business benefit -- I think it will make a huge impact.

      Let me give you an example. Think about supplier risk. Today, you look first at the risk of the suppliers on the network with whom you are directly doing business. You want to know everything about them, their profile, and you care about them being a good business partner to you.

      But think about the second, third, and fourth tiers of your supply chain, where things become less visible to your business. All that information for those further tiers is not directly available on the network; it is distant. But if those signals can be captured and somehow surfaced in your decision-making, it can really reduce risk.
      Reducing risk means more productivity, more benefits to your businesses. So that is one advantage I could see, but there will be a number of advantages. I think we'll run out of time if we start talking about all of those.

      Gardner: Sanjay, help us better understand. When we take these technologies and apply them to procurement, what does that mean for the procurement people themselves?

      Almeida: There are two inputs that you need to make strategic decisions, and one is the data. You look at that data and you try to make sense out of it. As Sudhir mentioned, there is a limit to human beings in terms of how much data processing that they can do -- and that's where some of these technologies will help quite a bit to make better decisions.

      The other part is personal bias: eliminating personal biases by using the data will improve the accuracy of your strategic decisions. A combination of those two will help make better, faster decisions, and procurement groups can focus on the right stuff versus being busy with day-to-day tasks.

      Using these technologies, the data, and the power of the data from computational excellence -- that's taking the personal biases out of making decisions. That combination will really help them make better strategic decisions.

      Bhojwani: Let me add something to what Sanjay said. One of the biggest things we're seeing now in procurement, and in enterprise software in general, is that people's expectations have clearly gone up based on their personal experience outside. I mean, 10 years back I could not have imagined that I would never go to a store to buy shoes. I thought, who buys shoes online? Now, I never go to stores. I don't know when I last bought shoes anywhere but online; it's been a few years, in fact. Now, think about that expectation on procurement software.

      Currently, procurement is looked upon as a gatekeeper; they ensure that nobody does anything wrong. The problem with that approach is that it is a “stick” model; there is no “carrot” behind it. What users want is, “Hey, show me the benefit and I will follow the rules.” We can't punish the entire company because of a couple of bad apples.

      By and large, most people want to follow the rules. They just don't know what the rules are; they don't have a platform that makes that decision-making easy, that enables them to get the job done sooner, faster, better. And that happens when the user experience is acceptable and where procurement is no longer looked down upon as a gatekeeper. That is the fundamental shift that has to happen, procurement has to start thinking about themselves as an enabler, not a gatekeeper. That's the fundamental shift.