Watch GDM Advisor Joe Wilhelmy interview Jason Stangel of Slalom and Mark Mims of Google. This is a candid discussion on how to get started with Artificial Intelligence, along with best practices and practical insights for a successful AI program.
It is generally accepted that successful businesses thrive by consistently making better decisions than their competitors, and the agriculture industry is no exception.
Through the application of artificial intelligence (AI) and machine learning (ML), growers can access increasingly sophisticated data and analytics tools, which enables better decisions, improved efficiencies, and reduced waste in food and biofuel production, all while minimizing negative environmental consequences.
Here is a list of innovative ways professionals in the agriculture industry are taking advantage of the powerful solutions now available through AI technologies.
“Digital farming brings increased precision to crop production by supporting … key farm management decisions with data-driven insights.”1
Crop production plays a critical role in the food and biofuel industries worldwide, and ML is radically improving the way farmers contribute on both fronts.
According to Dr. Jason Bull, Vice President of Machine Learning Solutions at Object Computing, “Farmers make hundreds of complex and interconnected decisions every year that impact their risk, sustainability, and business returns.”
Using sensors in the field in concert with ML-enabled digital applications, farmers now have the means to predict harvest yields and evaluate crop quality, identify plant species, and detect crop disease and weed infestations in ways that were previously impossible.2
YIELD PREDICTION AND QUALITY ASSESSMENT
Through the application of ML technology, a farmer can log into a customized dashboard on a computer or tablet and access an accurate assessment of the harvestable versus non-harvestable acres on a given day. The weight and maturity of harvestable crops can also be measured and predicted.
Additionally, using a variety of technologies, including image analysis, crops can be evaluated both before and after harvest for the presence of desirable features, extent of damage (if applicable), nutritional makeup, and other factors that may impact the ultimate viable yield and product price.
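The harvestable-versus-non-harvestable assessment described above can be sketched in a few lines. The field names, maturity threshold, and sensor values below are illustrative assumptions for this article, not any vendor's actual API:

```python
# Hypothetical sketch: summarizing per-acre sensor data into the kind of
# dashboard figures described above. Thresholds and readings are made up.

MATURITY_THRESHOLD = 0.8  # fraction of full maturity considered harvestable (assumed)

acres = [
    {"acre_id": 1, "maturity": 0.92, "est_weight_tons": 3.1},
    {"acre_id": 2, "maturity": 0.55, "est_weight_tons": 2.4},
    {"acre_id": 3, "maturity": 0.87, "est_weight_tons": 2.9},
]

def yield_summary(acres, threshold=MATURITY_THRESHOLD):
    """Split acres into harvestable vs. not, and total the projected weight."""
    harvestable = [a for a in acres if a["maturity"] >= threshold]
    return {
        "harvestable_acres": len(harvestable),
        "non_harvestable_acres": len(acres) - len(harvestable),
        "projected_tons": round(sum(a["est_weight_tons"] for a in harvestable), 2),
    }

print(yield_summary(acres))
```

A real system would feed such a summary from continuous sensor streams and an ML yield model rather than hand-entered numbers, but the decision output (harvestable acres and projected weight on a given day) is the same shape.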
PLANT SPECIES IDENTIFICATION
Many plants have similar leaf compositions, colors, and shapes, making them difficult to distinguish with the naked eye. Farmers can now rely upon ML to assess complex patterns and accurately identify related plant and weed species.2
Digital identification of plant species saves farmers time, allowing them to increase productivity in other critical areas.
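As a toy illustration of the pattern-matching idea, here is a 1-nearest-neighbour rule over simple leaf measurements. Production systems use deep learning on imagery; the feature values and labels below are invented for illustration:

```python
# Hypothetical sketch: classify a plant as crop vs. weed by finding the closest
# labeled example. Feature tuples are (leaf length cm, leaf width cm, color index);
# all values and labels are made up.
import math

labeled = [
    ((4.2, 1.8, 0.31), "crop"),
    ((4.0, 1.7, 0.29), "crop"),
    ((2.1, 0.9, 0.55), "weed"),
    ((2.3, 1.0, 0.52), "weed"),
]

def classify(sample):
    """Return the label of the closest labeled example (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(labeled, key=lambda item: dist(item[0], sample))[1]

print(classify((2.2, 0.95, 0.5)))  # nearest labeled examples are weed samples
```

The same decision boundary idea, scaled up to millions of image-derived features, is what lets ML separate species that look nearly identical to the human eye.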
CROP DISEASE AND WEED DETECTION
ML-driven image processing allows farmers to rely upon digital tools to recognize weed species and to determine which crops are healthy and which ones are infested with disease caused by fungi, bacteria, or viruses.
The ability to identify weeds with digital tools makes it possible to train mechanical devices (robots) to pull weeds from fields, protecting the environment from damage caused by pesticide use and saving farmers time, effort, and money.
Additionally, digital applications that evaluate crops for disease can provide an accurate diagnosis and recommend an optimal treatment plan.
This technology helps farmers avoid settling on a one-size-fits-all solution that not only fails to address a specific disease, but may inadvertently cause undesirable side effects, such as pollution or bee-population reduction. It also allows the companies that manufacture crop disease treatment products to better serve their customers.
Today’s farmers rely upon ML-enhanced technology to:
– Map and estimate yields
– Better meet demand without unnecessary waste
– Make smarter harvesting and pricing decisions
– Identify and automatically remove harmful weeds
– Find and treat crop disease with targeted solutions
– Accurately classify weed species
– Increase productivity, save time, and operate more economically
… and the list continues to grow!
Those who work with livestock are also experiencing time and cost savings as a result of ML-driven technology.
ML technology currently improves farming and ranching operations in a variety of areas, including livestock health maintenance, dairy and egg production, animal herding, and selective breeding.
LIVESTOCK HEALTH MAINTENANCE
ML helps farmers maintain happy and healthy herds of cattle.
In one use case, a company in Amsterdam uses sensors to monitor cow behavior. ML analysis of the gathered data predicts fertility patterns, diagnoses eating disorders, and alerts farmers to signs of heat stress.4
In another example, a dairy cow’s health status can be evaluated by applying deep learning algorithms to images of white blood cells extracted from the cow’s blood or milk. This process provides indications of certain health issues earlier than simple observation provides. Farmers can then initiate appropriate wellness measures before antibiotics become necessary.4
DAIRY AND EGG PRODUCTION
ML-driven data analysis also helps ranchers optimize operations to more accurately and efficiently manage production of milk, eggs, and other foods.2
For example, the same technology that gathers and analyzes data on dairy cow activity also allows farmers to make operational decisions that improve milk output by up to 30%.5
ANIMAL HERDING
Autonomous robots aren’t just monitoring fields of crops for weeds and diseased plants. They’re also being trained to herd cattle and sheep.
In addition to physically herding animals toward a desired destination, these ML-driven devices can haul heavy objects from one place to another and cooperate with drones to relay critical information to farmers.6
SELECTIVE BREEDING
Selective breeding involves the use of genetic data to optimize livestock pregnancy rotations and encourage the perpetuation of favorable traits, such as milk quality, disease resistance, fertility, and more.
Selective breeding is not new. For centuries, ranchers have relied upon observable factors to produce livestock lines that possess desirable characteristics. Today, with the assistance of ML-supported sensors and applications, a vast amount of data regarding genetic molecular markers, environment, feed makeup, birth patterns, and more can be analyzed, and ranchers can make livestock-mating decisions with significantly more accurate results.
Ranchers rely upon ML-enhanced technology to:
– Analyze livestock behavior patterns and provide better care and feeding recommendations
– Diagnose disease and preemptively apply appropriate treatments
– Optimize milk and egg production
– Herd cattle and sheep
– Perform manual labor
– Evaluate and communicate farm-condition data
– Predict fertility patterns and improve breeding selection to produce hardier stock
– Increase productivity, save time, and operate more economically
AGRONOMY, BREEDING, AND BIOTECHNOLOGY
Agronomy, breeding, and biotechnology include practices that help improve agriculture operations and outputs, including water and soil management, hybrid plant optimization, and sustainable agrochemical production and application.
WATER AND SOIL MANAGEMENT
Through ML-assisted analysis of precipitation and evapotranspiration (the process by which water transitions from soil and plant transpiration to the atmosphere), technologists develop more efficient resource management procedures and irrigation systems.2
ML is equally well equipped to analyze data regarding soil conditions, including moisture level, temperature, and chemical makeup, all of which have an impact upon crop growth and livestock well-being.
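The precipitation and evapotranspiration analysis above reduces, at its simplest, to a water balance: irrigate what the field lost to ET minus what rain already supplied. A minimal sketch, with invented daily figures:

```python
# Hypothetical water-balance irrigation rule. Daily ET and rainfall numbers
# below are illustrative, not measured data; real systems derive ET from
# weather and soil-sensor inputs.

def irrigation_need_mm(et_mm, rain_mm):
    """Water to apply (mm): ET not covered by rain, never negative."""
    return max(et_mm - rain_mm, 0.0)

daily = [  # (evapotranspiration mm, precipitation mm)
    (5.2, 0.0),
    (4.8, 6.5),   # rain exceeded ET: no irrigation needed that day
    (6.1, 2.0),
]

total = sum(irrigation_need_mm(et, rain) for et, rain in daily)
print(round(total, 1))  # 5.2 + 0 + 4.1 = 9.3
```

ML enters the picture by predicting the ET and rainfall terms ahead of time, letting an irrigation system schedule water before stress occurs rather than react after it.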
HYBRID PLANT OPTIMIZATION
As with humans, plants’ characteristics are determined by their genes. Certain genes help plants absorb water and nutrients better than others, while others help them fight disease more effectively. Some genes even affect how a plant may end up tasting!
An entire industry revolves around developing commercial seed products that combine the best features of various plant strains.
Without the aid of ML-based technology, a single hybrid development cycle can take scientists seven or eight years (although this is still faster than the speed at which nature performs the process!).
By evaluating masses of data on plant performance in various conditions over time, ML algorithms help scientists better optimize the identification of biotech traits needed to profitably increase yields, given the likelihood of harmful environmental factors, such as unfavorable weather conditions and insect populations, in a given season. This optimized use can also improve the longevity of these hugely beneficial and expensive-to-create biotech traits by reducing resistance buildup.
Simply put, ML helps scientists make predictions regarding which gene combinations will lead to desirable traits in new plants, providing an excellent starting point for developing hardier (and possibly more flavorful!) plant species.
AGROCHEMICAL PRODUCTION AND APPLICATION
One of the key strengths of ML technology is predictive analysis. Companies that develop chemical and biochemical products for farmers and ranchers, such as pesticides, crop disease treatments, antibiotics, microbials, and more, rely upon ML to assist them in ensuring that product efficacy is maximized and that the environmental impact of the products they place on the market is minimized.
Additionally, as mentioned earlier, ML simplifies and streamlines the entire crop-disease identification, diagnosis, and treatment process.
A mobile app capable of providing farmers with plant disease diagnoses in real time can also deliver product recommendations. Order fulfillment can then be handled via ML-assisted digital processes, and spray drones programmed to recognize the crops that require treatment can deliver the right products to the right plants without disturbing neighboring healthy plants.
Plant disease isn’t the only threat to crops and livestock that ML helps professionals combat. By using a similar image-analysis algorithm, pest control companies provide their associates a reliable, real-time tool to identify bugs, allowing them to provide targeted extermination services.7
AGRONOMY, BREEDING, AND BIOTECHNOLOGY SUMMARY
Technologists rely upon ML-enhanced technology to:
– Analyze and optimize water and soil resources
– Develop better hybrid plant species
– Reduce harmful environmental impacts of pesticides and other products
– Remotely diagnose crop disease and provide targeted solutions
– Deliver crop disease treatments and pesticides with pinpoint accuracy via spray drones
– Identify pests and provide targeted solutions
– Increase productivity, save time, and operate more economically
Borrowed with permission from the thought leaders at Object Computing Inc.
For more information contact email@example.com
There is a time and a place for virtual, self-paced training solutions, but if you want to build confident teams and lower the risk of project failure, instructor-led training is key.
With the goal of helping our clients look at the human side of analytics programs, we advocate the use of training for data teams. Although self-paced training is available for all concepts and technologies, there are cases when instructor-led training is much more effective. This is true of public classes, and even more so of customized training developed around an organization’s goals and requirements. Benefits of instructor-led training include:
- Enhanced comprehension of training materials through human engagement.
- Increased team member collaboration and ideation.
- The ability to ask for further clarification, including 1×1 with the instructor, which self-paced training cannot offer.
- Networking and exposure to different points of view when the training is public.
These benefits also apply to virtual instructor-led training that is sometimes the only viable solution for global teams.
It is also key that you consider training over and above technical/certification training. Great Data Minds has assembled (and continues to add to) a collection of instructor-led workshops for data teams. See our live training workshops here.
Reality or not, the perception nowadays is that data modeling has become a bottleneck and doesn’t fit an agile development approach. And with NoSQL being “schema-less”, the perception is often that there is no need for data modeling ahead of coding. You may pretend that it is not happening. Or blame complexity, speed of change, culture, or developers’ mentality. Or argue that data modeling is actually agile.
In the meantime, data modelers feel left out of the development process… because they are! They fear for their jobs, long term if not sooner. This is a recurring theme we sense at every Fortune 500 company across the US and Europe when we give our training ‘Agile Query-Driven Data Modeling for NoSQL’.
The reality is that data modeling needs to be re-invented in order to remain relevant. And since there is so much baggage associated with the term “data modeling”, maybe we should give it a less threatening name, such as “schema design”?
Here, the purists generally stop me to say: “Wait, you can’t go straight into physical modeling without doing first the conceptual then logical models.” Well… maybe, but that’s part of the issue. If you can’t demonstrate that you facilitate speed to market, then you’re viewed as being in the way, and autonomous agile teams will try to get around you.
Logical modeling is counter-productive (for NoSQL)
Working our way backwards in the traditional sequence: conceptual -> logical -> physical, we all know by now that schema design is actually more important with NoSQL than with relational databases, since JSON is so powerful and flexible, but not so forgiving.
Logical modeling makes sense when aiming to achieve an application-agnostic database design, which is still best served by relational database technology. But when designing a NoSQL database, which should be application-specific to leverage the benefits of the technology, it becomes apparent that logical modeling is a counter-productive step. Since logical modeling is supposed to be normalized while NoSQL schema design will be mostly denormalized, why go through the logical modeling exercise at all?
Some sort of conceptual modeling continues to be required to document the understanding and blueprint of the business. But when dealing with NoSQL and agile development, we propose that Domain-Driven Design should replace conceptual modeling. Then, driven by business rules and application screens, reports and queries, we can map directly from domain aggregates in bounded contexts of DDD to the design of the NoSQL physical schema, thereby bypassing logical modeling.
Domain-Driven Design helps avoid “big balls of mud”
Creating an enterprise model is achievable for the initial incarnation of software systems. But without care and attention, inherent domain and technical complexity will, over time, turn monolithic applications into a pattern known as the “big ball of mud”. Change is risky, and the best developers spend valuable time fixing technical complexity and technical debt, instead of adding value in domain evolution.
Domain-Driven Design is a language- and domain-centric approach to software design for complex problem domains. It recognizes that over time, an enterprise conceptual model will lose integrity as it grows in complexity, as multiple teams work on it, and as language becomes ambiguous. With DDD you decompose complex problems so you can be effective at modeling bounded contexts that are defined with unity and consistency. DDD promotes the use of a Ubiquitous Language to minimize the cost of translation between business and technical terminology and to enable deep insights into the domain through a shared language and collaborative exploration during the modeling phase.
DDD consists of a collection of patterns, principles, and practices that enable teams to focus on what’s core to the success of the business while crafting software that tackles the complexity in both the business and the technical spaces. One such pattern is an aggregate, a cluster of domain objects that can be treated as a single unit, for example an order and its order lines.
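The order/order-lines aggregate named above maps naturally onto a document. A minimal sketch (field names and values are illustrative) showing the whole aggregate stored and retrieved as one unit, with order lines embedded rather than normalized into a separate table:

```python
# Sketch: a DDD aggregate (order + order lines) expressed as a single
# NoSQL-style JSON document. All identifiers and values are illustrative.
import json

order_aggregate = {
    "_id": "order-1001",
    "customer_id": "cust-42",           # reference to a different aggregate
    "status": "open",
    "order_lines": [                     # embedded: lives and dies with the order
        {"sku": "A-100", "qty": 2, "unit_price": 9.99},
        {"sku": "B-205", "qty": 1, "unit_price": 24.50},
    ],
}

doc = json.dumps(order_aggregate)        # what a document database would store
total = sum(l["qty"] * l["unit_price"] for l in order_aggregate["order_lines"])
print(round(total, 2))  # 44.48
```

One read returns the entire consistency boundary, which is exactly the aggregate property DDD defines, and exactly the access pattern document databases are optimized for.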
Domain-Driven Design maps directly to the concepts of Agile and NoSQL
There’s nothing in agile to suggest that one should skip design. It suggests that design should be evolutionary and iterative. DDD also encourages an iterative process, first at a strategic level to divide the work and focus on what’s important to the business, then at a tactical level to understand the details of each bounded context.
On the database side, relational modeling is vastly different from the types of structures application developers use. Database joins slow performance, and the object-relational impedance mismatch pushes developers away from relational modeling and toward aggregate models. When an aggregate is retrieved from the database, the developer gets all the necessary related data at once, which simplifies manipulation.
A NoSQL document structure corresponds to the structure of a programming object in a much better way than a relational database does, and at the same time, can closely represent DDD aggregates of domain objects.
Back to our proposal that logical modeling should be avoided: why would you break down domain aggregates into normalized entities, only to re-assemble them during the physical schema design process?
If you had a logical model, how would you go about doing your NoSQL schema design with no knowledge of what the queries and reports will look like? In other words, how would you perform entity aggregation without the context of the application screens and their content?
Document schema design
Having defined the aggregates of a bounded context, it is necessary to create additional artifacts: mainly a pragmatic charting of workflows and business rules (not a full BPMN model, which would be hard to produce, maintain, and digest), plus mockups (or wireframes) for application screens and reports. What’s important here is not to fall into the same traps reviewed earlier with enterprise data models! The creation of these artifacts tends to reveal points of attention that may have been overlooked in the DDD phase.
Based on the above streamlined process, the actual schema design step should be clearer. But the flexibility and power of JSON is the next challenge. It seems so intuitive at first that it is easy to overlook the potential traps.
Say you’ve agreed to denormalize and aggregate information into one document. The next question is “how?” There are probably as many different ways to do it as you have members on your team: do you embed locally all related entity data? Or do you embed a partial duplicate or snapshot of remote entity data? Or do you refer to remote entity data, with one- or two-way referencing?
Here are a few factors influencing choices in relationship expression:
- cardinality: does high cardinality lead to practical or technical issues?
- strength of entity relationships: do they all conceptually belong together?
- query atomicity: what info needs to be returned together?
- update atomicity: must it all change together?
- update complexity: what’s the impact if data is duplicated? How do we avoid data inconsistency?
- document size: how much time will it take to load? Are we in a mobile environment where data traffic matters? Will the document size grow indefinitely?
- coding complexity: does it all make sense in the code?
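The embedding-versus-referencing choice those factors drive can be made concrete with a toy blog example (collection and field names are illustrative):

```python
# Sketch: two ways to express the same one-to-many relationship in a document
# schema. All names and values are illustrative.

# Option 1: embed related data locally. One read returns everything (query
# atomicity), but document size grows with cardinality.
blog_embedded = {
    "_id": "post-1",
    "title": "Schema design",
    "comments": [
        {"author": "ana", "text": "Nice!"},
        {"author": "ben", "text": "Agreed."},
    ],
}

# Option 2: refer to remote data. The parent document stays small and stable,
# but reading a post with its comments now takes two queries, or an
# application-side join like the one below.
blog_referenced = {"_id": "post-1", "title": "Schema design"}
comments = [
    {"_id": "c-1", "post_id": "post-1", "author": "ana", "text": "Nice!"},
    {"_id": "c-2", "post_id": "post-1", "author": "ben", "text": "Agreed."},
]

def comments_for(post_id):
    """The join the application must perform under the referencing option."""
    return [c for c in comments if c["post_id"] == post_id]

print(len(blog_embedded["comments"]), len(comments_for("post-1")))  # 2 2
```

Neither option is universally right; the factors in the list above (cardinality, atomicity, duplication cost, document growth) decide it case by case.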
The added-value of Data Modelers
Beyond the provocative nature of the headline, the exercise of designing a NoSQL database is obviously far from trivial. The dynamic and evolving nature of a JSON structure is a wonderful opportunity that should not be spoiled by a careless approach. While developers are certainly capable of doing their own schema design, is it really the best allocation of resources? In enterprises dealing with any kind of application complexity, it quickly becomes obvious that data modelers can be tremendous contributors to the quality of agile development.
Years of experience in data modeling of relational databases have trained them to naturally:
- focus on the core business use case
- create pragmatic models without being over ambitious or perfectionist
- reveal hidden insights and simplify
- experiment with different designs to reach a flexible solution
- challenge assumptions and look at things from a different perspective
- facilitate the dialog between application stakeholders
Data modeling is no longer an exercise taking place just in the early stage of an application lifecycle. Data modeling is now part of the iterative agile development and continuous integration loop, adding value every step of the way.
Even in production, data modeling is used to reverse-engineer all production NoSQL databases to discover new fields and structures that may have been added, providing unique documentation of unstructured and semi-structured data – a critical factor in the context of GDPR and privacy regulations.
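The reverse-engineering idea above amounts to scanning production documents and reporting which fields appear, and how often, so fields developers added over time become visible. A minimal sketch (the document contents are invented):

```python
# Hypothetical sketch: discover the implicit schema of a document collection by
# counting dotted field paths across documents. Sample data is made up.
from collections import Counter

docs = [
    {"name": "a", "email": "a@x.com"},
    {"name": "b", "email": "b@x.com", "consent": {"gdpr": True}},
    {"name": "c", "consent": {"gdpr": False}},   # "consent" appeared later in the app's life
]

def field_paths(doc, prefix=""):
    """Yield dotted paths for every field, descending into nested objects."""
    for key, value in doc.items():
        path = f"{prefix}{key}"
        yield path
        if isinstance(value, dict):
            yield from field_paths(value, prefix=path + ".")

counts = Counter(p for d in docs for p in field_paths(d))
coverage = {path: n / len(docs) for path, n in sorted(counts.items())}
print(coverage)
```

A field present in only a fraction of documents is precisely the kind of undocumented structure that matters under GDPR: you cannot protect personal data you don't know you hold.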
As usual when a major shift is under way, there are two possible approaches: resist change, or embrace it. Data modelers should not fear agile development. They should enthusiastically embrace change, become the developers’ best friends, and demonstrate their tremendous added value in achieving higher-quality applications together.
:: Borrowed with author’s permission from the original post. ::
“Nothing is as painful to the human mind as a great and sudden change.” – Mary Shelley, British novelist and author of Frankenstein
Transformation is hard, but it’s a necessity in these accelerated, innovative times. At the forefront of transformation projects are technology and process change. Rarely does management take a hard look at the human side of transformation, and this is often what will delay or even kill a project.
Human transformation takes investment, but is well worth the time and effort. Even with that investment, you can expect to lose an average of 30% of your team, as some will choose not to embrace your new direction. However, corporate knowledge is invaluable and if you can guide your current team through the changes, retaining key resources will always be the best route to success.
True story: At one of my past companies, we were awarded an MPP replacement project by a Fortune 500 organization. After working closely with management on requirements and a final deployment plan, we were asked to lead a project kick-off meeting with the team that was supporting their Teradata environment. As we started to discuss the use of cloud services and supporting agile tools for ELT, you could feel the frustration rising in the room. One hour into a half-day meeting, three of the team members stormed out. The project was put on hold for two months as the organization regrouped and took a hard look at the current team, its gaps, and the changes that needed to be made.
Our lesson as a consulting vendor was to help our clients early on with the human side of transformation. We came up with a few guiding principles that apply to all organizations.
1. Communicate Change Long Before the Project Begins
Assuming the team will just go along with change is the number one problem. Humans work much better with knowledge. Communicate to your teams starting with the ideation phase. Heck, even make them part of that phase through a Design Thinking session. Continue those communications throughout the project and even after deployment.
2. Education is Key
Develop a formal program for the education of your team. Start with the basics, such as what is driving this change, not just in your organization but in the world at large. Technical training is key, and in these times it means learning additional languages. There should also be an emphasis on agile. Consider “outside of the technical box” training such as Data Storytelling or enhanced visualization training. Sit back and watch how appreciative and excited your team is.
3. Promote Armchair Analytics
It is hard to ask your team to give you more than 40 hours, but there are those that thrive on learning new technologies. We made sure each team member had credits from top cloud vendors (usually the first $300 are free) so they could dive into the technology in their spare time if they wanted to. You will be surprised how many folks appreciate this and take advantage of it.
4. Stand Up Sandboxes with Meaning
We always had special team projects being developed in our Cloud environments that were governed by our management team. Resources were granted hours out of their regular 40 to develop IP for our company while learning new technologies and sharing ideas with their peers.
5. Encourage Mentoring
Inherently, humans want to help humans. Encourage mentoring across teams. We have named our mentoring program “Mentoring 360” to stress that it encourages resources to mentor each other, not just one way.
6. Spend a Bit More Time with Those that Are Struggling
We have seen meltdowns over and over again as innovation is introduced to the organization. There will be those who are frustrated and even bitter. As a manager, take extra time for these resources. We have proven that one-on-one coaching can be helpful.
These are just a few recommendations. Of course, formalized change management can’t be ignored, but humanizing transformation and implementing the above suggestions will produce results and make for a happier, much more effective team.
Here at Great Data Minds, we love to say that we will “change the world through the use of data”. Hearing this line by itself, you could definitely accuse us of being fanciful, maybe even “pie in the sky” data lovers. However, we believe, and embody the belief, that you can indeed change the world through data, because we have seen radical success stories during our years of working with data.
One particular project that has always been near and dear to our hearts is the work that Michael Ames and his team at the Colorado Center for Personalized Medicine did.
By modernizing their data practices and technologies, they were able to:
- accelerate the time to insights to support their research and healthcare professionals
- save millions of dollars per year. That money now goes back to true research (on childhood cancer) as opposed to being spent on traditional dataops.
This video gives an overview of the project:
If you would like more information about this case study or would like to discuss this work further, contact us at firstname.lastname@example.org.
Consider this scenario: You are the CIO of a medium-sized company with hundreds of employees. You’ve been tasked with modernizing the company’s technology portfolio. You ask your senior managers to provide recommendations as to where the company should invest its technology budget. After some time, you hear a lot of well-thought-out proposals and make an educated decision based on the information you have at the time. Let’s say you choose Technology A.
Fast forward five years and millions of dollars in investment.
The technology space has seen massive growth and the direction of the company has drastically changed from five years ago. You soon come to the realization that Technology A is no longer an efficient way to solve the company’s technical challenges. Rather than switch courses to a more efficient technology, i.e. Technology B, you plow ahead using inefficient Technology A because you’ve already sunk so much money into it that you feel you need to continue down the path that you chose five years ago.
This example illustrates the sunk cost fallacy and is commonly seen in technology industries.
What is the sunk cost fallacy?
It’s the idea that once you have committed resources to something, you must see it through, even as additional costs mount. A simpler example might be when you spend $10 on an extraordinarily large sandwich and feel you have to eat the entire thing to get your money’s worth. Or say you are waiting for a late bus and decide to keep waiting only because you’ve already waited for 30 minutes.
We’ve seen this issue throughout our careers. Huge companies become virtually paralyzed because their systems are ancient and fragile, yet they are resistant to change because of the massive investments they’ve made over the years.
Most recently, we’ve seen it with cloud computing. Companies who stand to gain considerable efficiencies and cost savings from moving their on-premise infrastructure to the cloud fail to do so because they have an emotional connection to their past decisions.
Why is the sunk cost trap bad?
The sunk cost trap exposes biases you might have towards remaining in a situation or with a technology purely because you’ve already made investments into it.
For some, these biases result in poor decision-making. Where logic would have you going in a different direction, you remain steadfast with your previous decisions because your emotions are telling you to protect your investment at all costs.
Persistence is often considered a valuable leadership quality. No one likes a quitter, as they say. However, if you ignore all the advice and evidence to the contrary and continue to throw good money after bad, how is that valuable leadership? In a fast-changing technology landscape, perhaps perseverance is not as good of a quality as we thought. Perhaps a good leader is one that knows when to cut losses and forge ahead in a different direction. Perhaps company cultures should be better at rewarding the admitting of mistakes over persistence with a clearly bad direction.
Why do people succumb to the sunk cost trap?
There are a variety of reasons, most of which are based on emotional connections with our past decisions. People who fall into the sunk cost trap believe that:
- Past investments must be honored in future decisions.
- Grit and tenacity are more important than common sense.
- Our luck will change if we stick with our commitments.
- We are too proud to admit that our past decisions were wrong.
- Letting go of something we currently have is uncomfortable, even when a better alternative is readily available.
With respect to technology, avoiding the sunk cost trap isn’t about building a company’s technology infrastructure without commitment or follow-through. There are decisions and commitments we may not feel like making today but that we know, deep down, are the right things to do for the wellbeing of the company.
Rather, it’s about spotting the pattern of honoring sunk costs and making better decisions in the future. Clearing the sunk cost “baggage” (technical and otherwise) from the decision-making process will make room for technical progress and agility. Sometimes walking away is the wisest decision one can make.
How do I avoid the sunk cost trap?
The most important thing you can do is recognize that any investment you’ve made into a technology to date, financial or otherwise, should not be a part of the decision making process. You should be open to realizing a loss, if necessary, and make your decisions based on future costs and benefits.
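That advice can be stated as a toy decision rule: compare options on future costs and benefits only. The dollar figures below are invented for illustration; note the prior investment in Technology A is recorded but deliberately never enters the calculation:

```python
# Toy sketch of sunk-cost-free decision making. All figures are hypothetical.

def best_option(options):
    """Pick the option with the highest future net benefit. Sunk costs ignored."""
    return max(options, key=lambda o: o["future_benefit"] - o["future_cost"])

options = [
    # sunk_spend is tracked for reporting but unused in the decision
    {"name": "Stay on Technology A", "sunk_spend": 5_000_000,
     "future_cost": 2_000_000, "future_benefit": 2_500_000},
    {"name": "Migrate to Technology B", "sunk_spend": 0,
     "future_cost": 1_200_000, "future_benefit": 3_000_000},
]

print(best_option(options)["name"])  # Migrate to Technology B
```

Anyone tempted to weight the $5M already spent on Technology A is, by definition, inside the trap: that money is gone under either option, so it cannot distinguish between them.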
One area where we see the sunk cost trap in action is in companies resisting migration of their on-premise data to the cloud. The benefits of moving an on-premise infrastructure to the cloud are well-documented, yet many companies resist a cloud migration because of their past investments in on-premise solutions. These benefits include:
- Increased access to computing power
- Decreased application design complexity
- Lower costs
- Faster time to market
- Improved disaster recovery
If you’re still dealing with an on-premise infrastructure and are holding back on pulling the trigger on a cloud migration, consider that you might be rationalizing your hesitation based on an illogical emotional connection to past decisions, and are continuing to invest in a technology that is perhaps no longer well-suited for your business.
Cloud Forward Thinking
Simply put, cloud computing is computing based on the internet. Where in the past people would run applications or programs from software downloaded onto a physical computer or server in their building, cloud computing allows people to access the same kinds of applications through the internet.
When you update your Facebook status, you’re using cloud computing. Checking your bank balance on your phone? You’re in the cloud again. Chances are you already rely on cloud computing to solve your business challenges, whether you’re firing off emails on the move or using a handful of apps to help you manage your workload.
In short, cloud is fast becoming the new normal. By the end of 2025, it’s estimated that 80% of all IT budgets will be committed to cloud apps and solutions.
Why are so many businesses moving to the cloud? It’s because cloud computing increases efficiency, helps improve cash flow and offers many more benefits. Here at Great Data Minds we believe wholly in cloud-based solutions.
::Originally published at https://skylarq.com/blog/cloud-sunk-cost-trap/::