Snapdeal and Bhashini Announce Partnership To Boost AI Vernacular Language Capabilities 

Snapdeal, one of India’s prominent e-commerce platforms, on Tuesday signed a Memorandum of Understanding (MoU) with Bhashini, an initiative established by the Ministry of Electronics and Information Technology (MeitY) within the Digital India Corporation (DIC), to bring AI capabilities to India’s multilingual landscape.

The partnership between Snapdeal and Bhashini will use AI to address India’s diverse linguistic landscape and build multilingual solutions that make digital services easier to understand and use. The MoU underscores both organisations’ commitment to advancing digital inclusivity in India.

Collaboration for Multilingual Capabilities

This strategic partnership leverages AI capabilities and language translation features to enhance the accessibility of digital services to users in over nine spoken languages.

“We look forward to combining Bhashini’s deep expertise in language solutions with Snapdeal’s robust e-commerce platform to empower individuals across diverse linguistic backgrounds,” said Himanshu Chakrawarti, CEO of Snapdeal.

The Snapdeal-Bhashini MoU is a promising step towards building an inclusive digital ecosystem in India. By breaking down language barriers, the partnership is also set to open up the e-commerce landscape to a wider base of users.

“We are committed to leveraging technology for the greater good and enabling inclusive digital experiences for all. Through innovative use of AI and voice-first technology, we are determined to break down barriers,” said Amitabh Nag, CEO of Bhashini.

Voice For India

Backed by the government of India, Bhashini has been working consistently to bridge language barriers with the help of AI. Through this platform, the government also aims to develop a National Public Digital Platform for local languages, enabling services for all citizens.

Last year, Bhashini released IndicTrans2, an open-source, transformer-based multilingual model for high-quality translation across 22 Indian languages. Interestingly, the initiative is also open to collaborating with people working in this field. “Anyone and everyone building LLMs to serve the Indian market can get in touch with us, and we are open to partnering with them,” said Nag, in a previous interaction with AIM.

Recently, Bhashini also collaborated with Nasscom to launch the “Be our Sahayogi” program to crowdsource multilingual AI problem statements.

The post Snapdeal and Bhashini Announce Partnership To Boost AI Vernacular Language Capabilities appeared first on AIM.

5 Free University Courses to Learn Coding for Data Science


I spent around $30,000 on a 3-year computer science degree to become a data scientist.

This was an expensive and time-consuming process.

After graduating, I realized that I could’ve just learned all the necessary skills online instead. Top-tier universities like Harvard, Stanford, and MIT have released dozens of courses for anyone to consume.

And the best part?

They’re completely free.

Thanks to the Internet, you can now get a world-class university education for free from the comfort of your home.

If I could start over, here are 5 free university courses I would’ve taken to learn coding for data science.

Note: Python and R are two of the most widely used programming languages for data science, and as such, most courses in this list focus on one or both of these languages.

1. Harvard University — CS50’s Introduction to Computer Science

Harvard's CS50 course is one of the most popular entry-level programming courses offered by the university.

It takes you through the fundamentals of computer science, covering both theoretical concepts and practical applications. You will be exposed to an array of programming languages, like Python, C, and SQL.

Think of this course as a mini computer science degree packaged into 24 hours of YouTube content. For comparison, CS50 covered what took me three semesters to learn at my own university.

Here’s what you will learn in CS50:

  • Programming Basics
  • Data Structures and Algorithms
  • Web Design with HTML and CSS
  • Software Engineering Concepts
  • Memory Management
  • Database Management

If you want to become a data scientist, a solid foundation in programming and computer science is required. You will often be expected to extract data from databases, deploy machine learning models in production, and build model pipelines that scale.

Programs like CS50 equip you with the technical foundation needed to progress to the next stage of your learning journey.

Course Link: Harvard CS50

2. MIT — Introduction to Computer Science and Programming

MITx's Introduction to Computer Science and Programming is another introductory course designed to equip you with foundational skills in computer science and programming.

Unlike CS50, however, this course is taught primarily in Python and places a heavy emphasis on computational thinking and problem-solving.

Furthermore, MIT’s Intro to Computer Science course focuses more on data science and the practical applications of Python, making it a solid choice for students whose sole aim is to learn programming for data science.

After taking MIT’s Intro to Computer Science course, you will be familiar with the following concepts:

  • Python Programming: Syntax, data types, functions
  • Computational Thinking: Problem-solving, algorithm design
  • Data Structures: Lists, tuples, dictionaries, sets
  • Algorithmic Complexity: Big O notation
  • Object-Oriented Programming: Classes, objects, inheritance, polymorphism
  • Software Engineering Principles: Debugging, software testing, exception handling
  • Mathematics for Computer Science: Statistics and probability, linear regression, data modeling
  • Computational Models: Simulation principles and techniques
  • Data Science Foundations: Data visualization and analysis

You can audit this course for free on edX.

Course Link: MITx — Introduction to Computer Science

3. MIT — Introduction to Algorithms

Once you’ve completed a foundational computer science course like CS50, you can take MIT's Introduction to Algorithms learning path.

This program will teach you the design, analysis, and implementation of algorithms and data structures.

As a data scientist, you will often need to implement solutions that maintain performance even as dataset sizes increase. You also have to handle large datasets that can be computationally expensive to process.

This course will teach you to optimize data processing tasks and make informed decisions about which algorithms to use based on the available computational resources.

Here’s what you’ll learn in Introduction to Algorithms:

  • Algorithm Analysis
  • Data Structures
  • Sorting Algorithms
  • Graph Algorithms
  • Algorithmic Techniques
  • Hashing
  • Computational Complexity

You can find all the lectures for Introduction to Algorithms on MIT OpenCourseWare.

Course Link: MIT — Introduction to Algorithms

4. University of Michigan — Python for Everybody

Python for Everybody is an entry-level programming specialization focused on teaching Python.

This is a 5-course learning path that covers the basics of Python, data structures, API usage, and accessing databases with Python.

Unlike the previous courses listed, Python for Everybody is largely practical. The specialization focuses on practical application rather than on theoretical concepts.

This makes it ideal for those who want to immediately dive into the implementation of real-world projects.

Here are some concepts you’ll be familiar with by the end of this 5-course specialization:

  • Python Variables
  • Functions and Loops
  • Data Structures
  • APIs and Accessing Web Data
  • Using Databases with Python
  • Data Visualization with Python

You can audit this course for free on Coursera.

Course Link: Python for Everybody

5. Johns Hopkins University — R Programming

You might have noticed that every course so far focuses on Python programming.

That’s because I’m a bit of a Python aficionado.

I find the language versatile and user-friendly, and knowledge of Python is transferable to a broad range of fields beyond just data science.

However, there are some benefits to learning R for data science. R was designed specifically for statistical analysis, and it offers a range of specialized packages for parameter tuning and optimization that aren’t available in Python.

You should consider learning R if you’re interested in deep statistical analysis, academic research, and advanced data visualization. If you’d like to learn R, the R Programming specialization by Johns Hopkins University is a great place to start.

Here’s what you’ll learn in this specialization:

  • Data Types and Functions
  • Control Flow
  • Reading, Cleaning, and Processing Data in R
  • Exploratory Data Analysis
  • Data Simulation and Profiling

You can audit this course for free on Coursera.

Course Link: R Programming Specialization

Learn Coding for Data Science: Next Steps

Once you’ve completed one or more courses outlined in this article, you will be equipped with a ton of newfound programming knowledge.

But the journey doesn’t end here.

If your end goal is to build a career in data science, here are some potential next steps you should consider:

1. Practice Your Coding Skills

I suggest visiting coding challenge websites like HackerRank and Leetcode to practice your programming skills.

Since programming is a skill best developed through incremental challenges, I recommend starting with the problems labeled “Easy” on these platforms, such as adding or multiplying two numbers.
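For a sense of scale, a hypothetical “Easy”-level warm-up (not taken from any specific platform) might look like this:

```python
# Hypothetical "Easy"-level warm-up: return the sum and product of two numbers.
def add(a: int, b: int) -> int:
    return a + b

def multiply(a: int, b: int) -> int:
    return a * b

print(add(3, 4))       # 7
print(multiply(3, 4))  # 12
```

Problems like this feel trivial, but solving a steady stream of them builds the muscle memory for the harder ones.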

As your programming skills improve, you can start increasing the level of difficulty and solve harder problems.

When I was starting out in the field of data science, I did HackerRank problems every day for around 2 months and found that my programming skills had dramatically improved by the end of that time frame.

2. Create Personal Projects

Once you’ve spent a few months solving HackerRank challenges, you will find yourself prepared to tackle end-to-end projects.

You can begin by creating a simple calculator app in Python, and progress onto more challenging projects like a data visualization dashboard.

If you still don’t know where to start, check out this list of Python project ideas for inspiration.

3. Build a Portfolio Website

After you’ve learned to code and created a few personal projects, you can display your work on a centralized portfolio website.

When potential employers are looking to hire a programmer or a data scientist, they can view all your work (skills, certifications, and projects) in one place.

If you’d like to build a portfolio website of your own, I’ve created a complete video tutorial on how to build a data science portfolio website for free with ChatGPT.

You can check out the tutorial for a step-by-step guide on creating a visually appealing portfolio website.


Natassha Selvaraj is a self-taught data scientist with a passion for writing. She writes on everything data science-related. You can connect with her on LinkedIn or check out her YouTube channel.

More On This Topic

  • KDnuggets News, May 4: 9 Free Harvard Courses to Learn Data…
  • 5 Free University Courses to Ace Coding Interviews
  • 5 Free University Courses to Learn Data Science
  • 5 Free Stanford University Courses to Learn Data Science
  • 5 Free University Courses to Learn Computer Science
  • 5 Free University Courses to Learn Databases and SQL

Snowflake Looks to Upskill Developers in India’s Rural Towns


Snowflake is making significant strides in India, with a vision that transcends traditional go-to-market strategies. AIM caught up with Vijayant Rai, the managing director of Snowflake India, at Snowflake’s Data Cloud Summit 2024. Rai elaborated on the company’s multifaceted approach to establishing a robust presence in the country.

“We’re looking at India in a multi-dimensional way,” Rai said, underscoring the company’s diverse operations. This includes a significant presence in Pune, where a team of 500 professionals handles operations and support. Additionally, Snowflake is leveraging India as a hub for global customers through its Global Capability Centers (GCCs).

“It’s not just about go-to-market; it’s also about what we can do for global customers from India,” he explained. Snowflake aims to be an integral part of this transformation by partnering with enterprises and SMBs to drive innovation.

The company’s engagement with India is driven by the country’s rapid economic growth and digital transformation. “India is the fastest growing global economy, with a growth rate of over 7%,” he said. Rai noted that the widespread adoption of digital public goods like the UPI framework has positioned India as a data-driven economy.

The AI in India Approach

Snowflake’s approach to AI is pragmatic, focusing on laying strong data foundations. “There’s no AI strategy without a data strategy,” Rai asserted. The company is helping enterprises break data silos and establish robust data governance frameworks. This is essential for leveraging advanced technologies like generative AI, which Rai believes will revolutionise how businesses operate.

Snowflake is launching boardroom-level workshops to support this transformation and educate senior management on devising effective data strategies. “We believe it’s part of our charter to educate the market,” Rai said. These workshops are designed to ensure that enterprises can maximise the potential of AI and other emerging technologies.

Snowflake is also addressing the demand for skilled developers by offering extensive training and certification programs. These initiatives extend beyond major cities to Tier-2 and Tier-3 locations, even small villages in India, reflecting Snowflake’s commitment to democratise access to AI.

“India is front and centre in our strategy,” Rai affirmed, highlighting Snowflake’s dedication to making a meaningful impact in the country.

The Tech Talent Prowess

One of Snowflake’s key initiatives is to leverage India’s talent and technological prowess. “We see India not just as a support hub but as a centre for innovation, especially in AI and data-driven technologies,” Rai stated.

He highlighted the significant role of GCCs, with over 600,000 tech professionals in India driving data innovation. Snowflake is committed to supporting these centres as they scale and enhance their capabilities.

The company is also focused on nurturing the developer community in India. “We’re investing heavily in skilling and ensuring that people are exposed to the Snowflake platform and various aspects of data and AI,” Rai said. This includes initiatives like language support in their large language model, which accommodates all major Indian languages.

Teams from Pune and other locations are already contributing to Snowflake’s global projects, and this collaboration is set to deepen over time.

In terms of market strategy, Rai emphasised the importance of understanding local business cultures and nuances. “We have experienced teams in Delhi, Bengaluru, and Mumbai who have worked in various verticals and understand the unique needs of different industries,” he said.

This local expertise is crucial in navigating the fast-paced technological changes that Indian enterprises are embracing.

Snowflake’s Shift to Generative AI Paying Off

Snowflake has been on an acquisition spree and much of its focus is on expanding its generative AI capabilities. Ever since Sridhar Ramaswamy joined as the CEO of Snowflake after the acquisition of Neeva, generative AI has been one of its biggest focuses.

In an exclusive interview with AIM, Snowflake head of AI Baris Gultekin said that he had worked with Ramaswamy for over 20 years at Google, and called him an incredible leader. “Sridhar brings incredible depth in AI as well as data systems. He has managed super large-scale data systems and AI systems at Google,” Gultekin said.

In addition, Microsoft announced an expanded partnership with Snowflake, aiming to deliver a seamless data experience for customers. As part of this, Microsoft Fabric’s OneLake will now support Apache Iceberg and facilitate bi-directional data access between Snowflake and Fabric.

Moreover, in a recent interview, Ramaswamy revealed that the cloud data company plans to deepen its collaboration with AI powerhouse NVIDIA. “We collaborated with NVIDIA on a number of fronts – our foundation model Arctic was, unsurprisingly, done on top of NVIDIA chips. There’s a lot to come, and Jensen’s, of course, a visionary when it comes to AI,” Ramaswamy said.


Using SQL with Python: SQLAlchemy and Pandas


As a data scientist, you need Python for detailed data analysis, data visualization, and modeling. However, when your data is stored in a relational database, you need to use SQL (Structured Query Language) to extract and manipulate the data. But how do you integrate SQL with Python to unlock the full potential of your data?

In this tutorial, we will learn to combine the power of SQL with the flexibility of Python using SQLAlchemy and Pandas. We will learn how to connect to databases, execute SQL queries using SQLAlchemy, and analyze and visualize data using Pandas.

Install Pandas and SQLAlchemy using:

pip install pandas sqlalchemy

1. Saving the Pandas DataFrame as an SQL Table

To create the SQL table using the CSV dataset, we will:

  1. Create a SQLite database using SQLAlchemy.
  2. Load the CSV dataset using Pandas. The countries_poluation dataset consists of the Air Quality Index (AQI) for cities across the world from 2017 to 2023.
  3. Convert all the AQI columns from object to numeric and drop rows with missing values.
```python
# import necessary packages
import pandas as pd
from sqlalchemy import create_engine

# create the new database
engine = create_engine("sqlite:///kdnuggets.db")

# read the CSV dataset
data = pd.read_csv("/work/air_pollution new.csv")

# convert the yearly AQI columns to numeric and drop rows with missing values
col = ['2017', '2018', '2019', '2020', '2021', '2022', '2023']
for s in col:
    data[s] = pd.to_numeric(data[s], errors='coerce')
    data = data.dropna(subset=[s])
```
  4. Save the Pandas dataframe as a SQL table. The `to_sql` function requires a table name and the engine object.
```python
# save the dataframe as a SQLite table
data.to_sql('countries_poluation', engine, if_exists='replace')
```

As a result, your SQLite database is saved in your file directory.
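As a quick sanity check, you can list the database’s tables with SQLAlchemy’s `inspect` to confirm the write succeeded. The sketch below uses an in-memory SQLite database and a one-row dummy dataframe standing in for kdnuggets.db and the full dataset:

```python
import pandas as pd
from sqlalchemy import create_engine, inspect

# in-memory SQLite stands in for kdnuggets.db in this sketch
engine = create_engine("sqlite:///:memory:")
pd.DataFrame({"city": ["Lahore"], "2023": [97.4]}).to_sql(
    "countries_poluation", engine, if_exists="replace"
)

# list the tables to confirm the write succeeded
tables = inspect(engine).get_table_names()
print(tables)
```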

Deepnote file manager

Note: I am using Deepnote for this tutorial to run the Python code seamlessly. Deepnote is a free AI Cloud Notebook that will help you quickly run any data science code.

2. Loading the SQL Table using Pandas

To load the entire table from the SQL database as a Pandas dataframe, we will:

  1. Establish the connection with our database by providing the database URL.
  2. Use the `pd.read_sql_table` function to load the entire table and convert it into a Pandas dataframe. The function requires the table name, the engine object, and the column names.
  3. Display the top 5 rows.
```python
import pandas as pd
from sqlalchemy import create_engine

# establish a connection with the database
engine = create_engine("sqlite:///kdnuggets.db")

# read the SQLite table
table_df = pd.read_sql_table(
    "countries_poluation",
    con=engine,
    columns=['city', 'country', '2017', '2018', '2019',
             '2020', '2021', '2022', '2023']
)

table_df.head()
```

The SQL table has been successfully loaded as a dataframe. This means that you can now use it to perform data analysis and visualization using popular Python packages such as Seaborn, Matplotlib, Scipy, Numpy, and more.

countries air pollution pandas dataframe

3. Running the SQL Query using Pandas

Instead of restricting ourselves to one table, we can access the entire database by using the `pd.read_sql` function. Just write a simple SQL query and provide it with the engine object.

The SQL query will select two columns from the "countries_poluation" table, sort the results by the "2023" column, and display the top 5 rows.

```python
# read table data using a SQL query
sql_df = pd.read_sql(
    "SELECT city, [2023] FROM countries_poluation ORDER BY [2023] DESC LIMIT 5",
    con=engine
)

print(sql_df)
```

This gives us the top 5 cities in the world with the worst air quality.

```
          city  2023
0       Lahore  97.4
1        Hotan  95.0
2      Bhiwadi  93.3
3  Delhi (NCT)  92.7
4     Peshawar  91.9
```

4. Using the SQL Query Result with Pandas

We can also take the results of a SQL query and perform further analysis. For example, let's calculate the average AQI of the top five cities using Pandas.

```python
average_air = sql_df['2023'].mean()
print(f"The average of top 5 cities: {average_air:.2f}")
```

Output:

The average of top 5 cities: 94.06

Or, create a bar plot by specifying the x and y arguments and the type of plot.

```python
sql_df.plot(x="city", y="2023", kind="barh")
```

data visualization using pandas

Conclusion

The possibilities of using SQLAlchemy with Pandas are endless. You can perform simple data analysis with a SQL query alone, but to visualize the results or train a machine learning model, you need to convert them into a Pandas dataframe.
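As a sketch of that last point, once query results are in a dataframe they feed directly into numerical tooling. Here a simple linear trend is fitted with NumPy; the sample dataframe stands in for a SQL query result, and the numbers are illustrative, not from the actual dataset:

```python
import numpy as np
import pandas as pd

# illustrative stand-in for a SQL query result with two yearly AQI columns
df = pd.DataFrame({
    "2022": [85.1, 90.2, 78.4, 92.7, 88.0],
    "2023": [97.4, 95.0, 93.3, 92.7, 91.9],
})

# fit a linear trend: this year's AQI as a function of last year's
slope, intercept = np.polyfit(df["2022"], df["2023"], deg=1)
predicted = slope * 80.0 + intercept
print(round(predicted, 1))
```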

In this tutorial, we have learned how to load a SQL database into Python, perform data analysis, and create visualizations. If you enjoyed this guide, you will also appreciate 'A Guide to Working with SQLite Databases in Python', which provides an in-depth exploration of using Python's built-in sqlite3 module.

Abid Ali Awan (@1abidaliawan) is a certified data scientist professional who loves building machine learning models. Currently, he is focusing on content creation and writing technical blogs on machine learning and data science technologies. Abid holds a Master's degree in technology management and a bachelor's degree in telecommunication engineering. His vision is to build an AI product using a graph neural network for students struggling with mental illness.

More On This Topic

  • How to Select Rows and Columns in Pandas Using [ ], .loc, iloc, .at…
  • How To Speed Up SQL Queries Using Indexes [Python Edition]
  • Query Your Pandas DataFrames with SQL
  • SQL in Pandas with Pandasql
  • Using the apply() Method with Pandas Dataframes
  • Using ChatGPT to Learn SQL

Wakefit Unveils Wakefit Zense, India’s First AI-powered Sleep Solutions Suite

Leading D2C home solutions company Wakefit has introduced Wakefit Zense, India’s first AI-powered sleep solutions suite. The flagship products, Regul8 and Track8, aim to improve sleep health and quality for Indian consumers.

Regul8 is India’s first mattress temperature controller, which allows users to manually set the temperature between 15°C and 40°C or choose from presets like neutral, cold, warm, ice, and fire. The ‘Smart Sleep Control’ feature adjusts the temperature automatically based on sleep duration and other relevant parameters.

It also supports dual preferences, so individuals can customise their sides of the bed independently. It offers seasonal versatility, providing cooling in summer and heating in winter, and is compatible with any bed and mattress type. Additionally, it is 60% more energy-efficient than a 1.5-ton air conditioner.

On the other hand, Track8 is an AI-powered contactless sleep tracker that provides precise insights into sleep patterns. Using a discreet sensor sheet placed under the mattress, Track8, as the name suggests, tracks sleep without the need for wearables like WHOOP or Fitbit.

Track8 also uses AI and machine learning algorithms to analyse sleep data and provide detailed insights. Users receive comprehensive sleep reports detailing sleep stages, respiratory rate, snoring, movement, and overall sleep quality, culminating in an aggregated sleep score the next day.

Chaitanya Ramalingegowda, director and co-founder, stressed the ambition to build the latest sleep technology specifically for India, saying, “It has always been our dream to build cutting-edge sleep technology specifically for India. Regul8, and Track8, are two of many products we are launching as part of the Zense range. I am particularly proud that we have built these products in-house and made them accessible and affordable to Indians. By leveraging AI and innovative design, we aim to address the unique sleep challenges faced by consumers.”

Along the same lines, Yash Dayal, chief technology officer and project lead for Zense, highlighted the user-centric design and latest technology of Regul8 and Track8, stating, “The two products, which have been introduced in India for the first time, leverage advanced technology with user-centric design and represent our goal to make sleep personalised, manageable, and accessible to customers.”

Regul8 is priced at Rs 44,999, while Track8 costs Rs 10,499. However, both products are available at a discounted launch price for now.

Technology has always been a key forte for the company, but this is the first time it is providing a technology-enabled solution for customers. Looking ahead, Wakefit plans to continue addressing specific sleep issues with targeted solutions, backed by extensive R&D.

Founded in 2016, Wakefit has secured $145.4 million across five funding rounds, with the latest Series D round completed in January 2023. The company is backed by five investors, including the most recent ones, Verlinvest and Susquehanna International Group (SIG).


SAP Partners with Mistral AI to Enhance Enterprise Software with Generative AI Solutions 

SAP has announced a partnership with Mistral AI to integrate advanced AI capabilities into its applications and enterprise software.

This collaboration will enable SAP customers to access Mistral AI’s models through SAP Business Technology Platform (SAP BTP) applications with generative AI capabilities.

Mistral AI is known for its open-weight models, Mixtral 8x7B and Mixtral 8x22B, as well as its enterprise-grade “Large” model. These models will be available through SAP’s generative AI hub in SAP AI Core, offering SAP customers the tools to enhance productivity, streamline operations, and accelerate digital transformation.

“We are excited about entering a partnership with Mistral AI and making the company’s LLM accessible to both our developers and our customers through the generative AI hub in SAP AI Core on SAP BTP,” said Philipp Herzig, Chief AI Officer, SAP.

“Together, we can truly make a difference by building AI-enabled solutions that create immediate value for users, organisations, and entire industries. We are particularly proud that two European technology companies are collaborating on bringing AI forward,” he added.

“We are pleased to embark on this partnership with SAP. We foresee the new horizons this collaboration will open up, enabling us to further our mission of making AI accessible to all. We are looking forward to witnessing the potential of our AI models to support innovation and streamline operations for SAP’s customers,” said Arthur Mensch, CEO of Mistral AI.

Mistral AI recently raised $640 million (€600 million) at a valuation of $6 billion led by investor General Catalyst.

“We are announcing €600M in Series B funding for our first anniversary. We are grateful to our new and existing investors for their continued confidence and support for our global expansion. This will accelerate our roadmap as we continue to bring frontier AI into everyone’s hands,” said Mensch.


Automation Anywhere Unveils New AI + Automation Enterprise System

Automation Anywhere, a leader in AI-driven automation, unveiled its AI + Automation Enterprise System at Imagine 2024. This new system incorporates second-generation GenAI Process Models to accelerate the discovery, development, and deployment of AI process automations.

The company also launched new AI Agents capable of handling complex cognitive functions. These solutions are designed to enhance organisational efficiency, slashing the time of process tasks from hours to minutes and increasing business impact up to tenfold in areas like customer service, finance, IT, and HR.

A significant new feature is the ability to build custom AI Agents with the new AI Agent Studio which is generally available and provides developers of all levels with low-code tools to easily build, manage, and govern custom AI Agents.

Developers can start with the foundational model of their choice, including models from AWS, Google Cloud, and the Microsoft Azure OpenAI Service, among others. They will also be able to augment AI Agents with enterprise knowledge through a native RAG service, with Amazon Bedrock support arriving in October.

Additionally, developers will have built-in prompt testing to ensure outputs are relevant for any use case before putting AI Agents into action.

The AI + Automation Enterprise System is powered by Automation Anywhere’s unique GenAI Process Models 2.0. These models are specially engineered for faster process discovery and automation creation, achieving 30% quicker automation setup and 50% increased automation resilience.

They also deliver 90% accuracy in document processing, surpassing what LLMs alone can deliver. The models are tuned using rich metadata from over 300 million process automations running on Automation Anywhere’s cloud-native platform.

Built on the new GenAI Process Models 2.0, Automator AI, now generally available, offers a set of GenAI products and capabilities that accelerate the automation lifecycle, making it faster and easier to build, deploy, and manage automations.

One of the standout features, the Generative Recorder, improves the resilience of UI automations by automatically detecting and adjusting to changes in a source application interface in real-time, reducing downtime by up to 50%.

Additionally, the Enhanced Autopilot feature, also generally available, enables cross-functional teams to quickly convert process documentation into draft process automations using GenAI, with the capability to incorporate inputs from any mining tool in BPMN format.

Document Automation, whose customer base has grown nine-fold year over year, leverages GenAI enhancements for real-time processing of any document type, including unstructured documents, and achieves more than 90% accuracy.

The updated system allows for rapid data capture from any document, including complex tables, supports more than 30 languages, and offers expanded model options.

The company is also enhancing its offerings with a suite of AI-powered solutions like Automation Co-Pilot, the embed-anywhere enterprise assistant for organisations, which is now conversational and available in preview thanks to an integration with the Amazon Q service.

It allows business users to work across any application and offers chat capabilities for real-time assistance, enabling users to query knowledge bases, engage AI Agents, or trigger automations directly within their workflow.

Additionally, the Service Operations Solution Accelerator, now generally available, helps teams with pre-packaged AI Agents and predefined workflows for various service operations.

New Solution Accelerators are planned for roll-out in sectors such as finance, IT, HR, healthcare, banking, and manufacturing in the coming quarters.

In a recent interview with AIM, Adi Kuruganti, chief product officer, Automation Anywhere, shared that six out of ten prominent banks in India use the company’s services. Genesys, Cargill, Salesforce, ZS, Honeywell, and Juniper Networks are some of the company’s customers, and all of them have achieved major milestones with its AI tools.

Petrobras, the largest oil and gas company in Brazil, used Automation Co-Pilot for Business Users in a POC to optimise tax processes and realised $120 million in savings. Automation Anywhere is the preferred automation vendor for major corporations like IDFC, Nestle and Adani, as well as the Indian government.

Kuruganti said, “About 50 to 60% of our global workforce, including product engineering, UX, and customer support, are based in India,” making it a key market for the company.

The post Automation Anywhere Unveils New AI + Automation Enterprise System appeared first on AIM.

UK Trails Behind Europe in Technical Skills Proficiency, Coursera Report Finds

The U.K. is the 25th most technically proficient country in Europe, a new report by online course provider Coursera has found. It sits well behind other digital leaders in the region like Germany, France and Spain.

Globally, the U.K. came 45th out of 109 countries, which is an improvement from last year’s 64th place; however, considering the government’s significant investments into digital skills, the country’s rank suggests that current efforts may need to be reassessed.

But the U.K. is by no means a reflection of Europe as a whole when it comes to technical proficiency. Switzerland came in first place on the global ranking, and European countries occupied 17 of the top 25 places. Germany, France and Spain came in 3rd, 5th and 7th, respectively.

The findings were published in Coursera’s 2024 Global Skills Report, which draws on data from more than 148 million global online course learners and other indicators including the Global Innovation Index, Labour Force Participation Rate, Human Capital Index and GDP per capita.

As an example of an area where the U.K. may be falling behind, the country has seen a lower uptake in AI upskilling in the past 12 months (961%) compared to the U.S. (1,058%) and the global average (1,060%). Donal McMahon, vice president of data science at job site Indeed, told TechRepublic earlier this year that companies around the world “are all searching for employees who know AI and can adapt to new and emerging technologies.”

SEE: Which IT Skills Are Most in Demand in Q1 2024?

Nikolaz Foucaud, managing director of EMEA at Coursera, told TechRepublic in an email: “While the U.K. boasts a strong technology services sector — one that employs over 1.7 million people — we need to significantly increase both enterprise and governmental investment in upskilling to create an internationally-competitive labour force.

“We must strive for greater collaboration between higher education institutions, government and the technology industry to meet the rapidly evolving skill requirements of the digital economy. Without this collaboration and the right level of investment, we will continue to fall behind in technical skills proficiency.”

U.K.’s digital skills shortage

The level of “skills-shortage vacancies,” where a job cannot be filled due to a lack of skills, qualifications or experience among applicants, is very high in the information and communications sector in the U.K. The figure climbed from an already high 25% in 2017 to 43% in 2022, the last year for which data is available.

SEE: Top IT Skills Trends in the U.K. for 2024

This digital skills shortage is not going unnoticed. In 2023, Red Hat surveyed IT managers in large U.K.-based enterprises about why teams were struggling with a skills shortage, and the top three reasons were:

  • High workloads preventing people from finding the time to upskill.
  • The lack of budget for training, upskilling or recruitment.
  • Teams working in silos, preventing cross-team learning opportunities.

Recent investments in the U.K.’s digital skills

The U.K. government has noted the country’s digital skills shortage, and has made a series of key investments in the past year or so to try to address it. In March 2023, the U.K. government launched its plan to make the country a science and technology superpower by 2030. More than £370 million was earmarked for boosting infrastructure, investment and skills for technologies like quantum and AI.

The following November, more than £200 million was announced to support colleges and universities to offer more training opportunities in industries including digital. This March, Science Secretary Michelle Donelan unveiled another package of more than £1.1 billion to fund 4,000 doctorates in engineering and physical sciences.

SEE: UK Tech Trends & Predictions for 2024

Microsoft has also made significant investments into bridging the U.K.’s digital skills gap. In December 2023, the tech giant announced a “multi-million pound investment” to provide AI skills training to more than one million people. It is hoped this will boost the U.K.’s AI sector by helping more people move into AI and data-related career fields.

While it may take a number of years for the impacts of these investments to come to fruition, the Coursera analysts wrote that the results of the Global Skills Report highlight how there’s still “an urgent need for targeted upskilling initiatives to ensure the workforce can meet the evolving demands of the digital economy.”

Popular and in-demand tech skills in Europe

AI

The Coursera report revealed that the number of individuals in the U.K. who enrolled for generative AI courses increased by 961% from 2023 to 2024. This reflects the growing popularity of technical roles like data analyst and software developer and the population’s interest in developing the skills necessary to fill them.

The country is over-indexing in skills like machine learning algorithms and applied machine learning, meaning that individuals are disproportionately enrolling in a given skill compared to learners globally. According to the U.K. government, the AI sector already employs more than 50,000 people in the U.K. and contributes more than £3.7 billion to the economy every year. By 2035, the U.K. AI market is forecast to grow to more than $1 trillion.

SEE: The 10 Best AI Courses in 2024

Foucaud told TechRepublic: “The meteoric rise of popularity in AI courses is largely being driven by demand for AI skills from businesses and institutions looking to capitalise on the promise of greater productivity and increased competitiveness that AI brings.

“It is also true that individuals either concerned about the threat that technological innovation may pose to their livelihoods, or excited by the prospect of acquiring cutting-edge new skills with a view to increasing seniority, salary, or both, are demonstrating an interest in AI irrespective of their organisation’s stance on the technology.

“Beyond the economic and personal development impulses behind high uptake of AI courses, there is also simply a strong interest in getting to grips with the nuances of a technology that will likely define the future of work, and have significant societal ripple effects.”

Cyber security

Cyber security is currently one of the biggest technological threats to U.K. businesses. A recent report from Microsoft and Goldsmiths, University of London found that just 13% of U.K. businesses are resilient to cyberattacks, with 48% deemed vulnerable and the remaining 39% facing high risk. This risk extends to Europe, with a 2023 study from Cisco finding that less than 10% of companies in the region are deemed mature enough to tackle today’s cybersecurity issues.

Despite the increasing adoption of AI and machine learning related skills, cyber security is not benefitting from the same popularity. The Coursera report found that European enrollments in cybersecurity courses declined by 5% in 2024, despite Europe being the region most targeted by cyberattacks.

Foucaud told TechRepublic: “IT and cyber experts report that, whilst businesses are hiring for cyber experts, the current hiring process is too reliant on university degrees, which do not on their own prepare candidates with the right cyber skills.

“To address this concern, there will be an increased need to deploy alternative forms of credential that prioritise equipping individuals with this essential skill set at speed and scale.”

Thanks to ‘Apple Intelligence’, not OpenAI, Siri Now Understands You Better

After making the world wait, Apple has finally uttered the word ‘AI’, though not without giving it its own spin by introducing ‘Apple Intelligence’.

Apple Intelligence is deeply integrated with the new Siri, allowing users to speak more naturally to Siri, thanks to its enhanced language understanding capabilities. Siri is also getting a new look, featuring an elegant glowing light that wraps around the edge of the screen when it is active.

Apple claimed that Siri users make 1.5 billion voice requests every single day. The Cupertino-based tech giant showed the world that it doesn’t need OpenAI’s help to improve Siri, and it’s pretty clear that it is not just a wrapper around ChatGPT.

Apple has developed a 3 billion parameter on-device language model and a larger server-based model accessible via Private Cloud Compute on Apple silicon servers. These models are trained using Apple’s AXLearn framework, an open-source project released in 2023, built upon JAX and XLA.

Apple Doesn’t Really Need OpenAI

Apart from building the in-house Apple Intelligence, the iPhone maker has partnered with OpenAI to integrate ChatGPT, powered by GPT-4o, into iOS.

Apple says Siri will now be able to tap into ChatGPT’s ‘expertise’ when needed. For example, if you need meal ideas using ingredients from your garden, you can ask Siri. Upon receiving your permission, Siri will send the prompt to ChatGPT and provide you with suggestions.

It’s worth noting that ChatGPT isn’t integrated directly into Siri; rather, Siri gains access to it. Apple has not revealed the specifics of the deal or the amount it is paying OpenAI.

Meanwhile, Tesla chief Elon Musk is unhappy with the Apple-OpenAI partnership. He said, “It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security & privacy!”

However, this claim is not true.

“Why is Elon Musk lying about how Apple’s ChatGPT integration works? He’s threatening to put iPhones in Faraday cages and claiming Apple isn’t smart enough to make its own LLMs,” said AI expert Vin Vashishta.

“He knows Apple uses its own on-device LLMs. He heard Apple say that ChatGPT is only recommended when on-device LLMs can’t handle the request, and data is only sent if users choose that option,” he added.

Moreover, there is no clarity on whether OpenAI will use users’ data to train its models. OpenAI said that users can choose to connect their ChatGPT account, in which case their data preferences will apply under ChatGPT’s policies.

OpenAI’s GPT-4o voice feature has not yet been released, and it would be interesting to see whether Apple will integrate it in the future. However, the company did drop a hint about multiple possible future partnerships.

There is a possibility that Apple might also partner with Google, especially after Google recently introduced ‘Project Astra’ at Google I/O 2024.

Astra is a universal AI agent built on Google’s Gemini models, designed to process multimodal information, such as text, images, video, and audio, to understand context and respond in real-time. Word has it that Apple may also partner with Anthropic.

Siri Can Take Actions

The highlight of the new Siri is its ability to take actions. With Apple Intelligence, Siri gains on-screen awareness, enabling it to understand and act on what’s displayed. For instance, if a friend texts you their new address, you can simply say, ‘Add this address to their contact card’, and Siri will do it right from the Messages thread.

you hear that?
"siri can take actions on your phone"
this is the LAM you want. holy shit.

— Sully (@SullyOmarr) June 10, 2024

Moreover, Siri can now take actions within apps on your behalf. It can perform hundreds of new tasks in and across apps, including utilising Apple’s new writing and image generation capabilities. For example, you could say, ‘Show me my photos of Stacey in New York wearing her pink coat,’ and Siri will display them instantly.

The feature is quite similar to Rabbit’s LAM and what Humane Ai Pin attempted to offer. However, it’s now clear that one doesn’t require a new device to access LLMs, and smartphones aren’t going anywhere anytime soon.

its over for rabbit R 1 and humane ai slop pic.twitter.com/WbPuR7cZEJ

— NIK (@ns123abc) June 10, 2024

Recent reports suggest that Humane is actively seeking a potential buyer for its AI Pin business after it faced widespread criticism for failing to meet expectations.

Moreover, thanks to Apple Intelligence, Siri now understands your personal context. With a semantic index of your photos, calendar events, and files, as well as information from messages and emails—such as hotel bookings, concert tickets, and shared links—Siri can now find and comprehend things it couldn’t before.

For example, when you’re filling out a form, you can simply ask Siri to locate a personal document, such as your driver’s license, from your photos and auto-fill the details into the form.
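The “semantic index” idea can be illustrated with a toy sketch: items are embedded as vectors, and a natural-language query retrieves the most similar item. All file names and descriptions below are hypothetical, and the bag-of-words vectors are a simplified stand-in for whatever learned embeddings Apple actually uses on-device:

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# A tiny on-device "semantic index" of personal items (hypothetical data).
index = {
    "drivers_license.jpg": "photo of driver's license identity document",
    "hotel_booking.eml": "hotel booking confirmation for saturday trip",
    "concert_ticket.pdf": "concert ticket for friday show",
}

def lookup(query, index):
    """Return the indexed item whose description best matches the query."""
    q = embed(query)
    return max(index, key=lambda k: cosine(q, embed(index[k])))

print(lookup("find my driver's license photo", index))  # drivers_license.jpg
```

The point of the sketch is that retrieval works over meaning-bearing descriptions rather than exact file names, which is what lets a request like “locate my driver’s license” resolve to a photo that was never labelled that way by the user.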

Apple Intelligence enables Siri to perform hundreds of new actions within Apple and third-party apps. For instance, a user might say, “Retrieve the article about cicadas from my Reading List”, or “Share the photos from Saturday’s barbecue with Malia”, and Siri will handle the tasks effortlessly.

This is made possible through significant enhancements Apple is making to App Intents, a framework that lets apps define a set of actions for Siri, Shortcuts, and other system experiences.


EU Rules Cause Global Rumbles for Claude, More to Follow?

Anthropic recently launched Claude in the European Union and updated its terms of service (ToS). The company highlighted policy refinements, high-risk use cases and certain disclosure requirements within its usage policy, possibly to align with EU regulations.

Interestingly, the policy changes applied to users worldwide. Soon after, complaints about the model’s performance began surfacing from across the globe.

Why the Change?

Users noticed a marked change in the way Claude reacted to certain prompts and questioning. While there have been several theories as to why the company decided to shuffle things, the most believable seems to be that Anthropic is trying to anticipate the upcoming EU AI Act, thanks to its recent deployment in the region.

As one Reddit user put it, the rest “is just a cheap conspiracy. The new ToS is because they are finally deploying to the EU, and therefore need to comply with this,” pointing to the EU’s Artificial Intelligence Act (AIA).

Anthropic has gone all-in on creating a more holistic policy ahead of its launch in the EU and, more recently, in Canada. However, other big tech companies have faced similar problems in the EU.

OpenAI, Meta and Others Follow

Now, Anthropic making overarching policy changes to fit EU standards isn’t unwarranted. The region has been notorious for cracking down on companies that fail to comply with its regulations.

Case in point, OpenAI was recently in hot water when an Italian regulatory body accused the company of violating the EU privacy laws. In January this year, the company was subjected to a fact-finding initiative by Italy’s Data Protection Authority (DPA), where they alleged that user data had been used to train OpenAI’s ChatGPT.

This, they said, was in violation of the EU General Data Protection Regulation (GDPR).

Similarly, Meta updated its privacy policy, stating, “To properly serve our European communities, the models that power AI at Meta need to be trained on relevant information that reflects the diverse languages, geography and cultural references of the people in Europe who use them.”

However, this was also flagged by the Austrian privacy organisation noyb as violating the EU GDPR.

With countries in the EU closely following AI companies on how they implement their policies, Anthropic’s need for such a drastic change makes sense. But whether this change is doing good overall is up for debate.

How Bad is the Change?

As per the updated usage policy, Anthropic prohibits the use of its services to compromise child safety, critical infrastructure or personal identities. It has also barred using its products to create emotionally and psychologically harmful content, as well as misinformation, including misinformation used in elections.

There are several other changes made to the policy, as well as to the ToS and privacy policies, including the right to request deletion of personal data and the option to opt out of having data sold to third parties.

While most would welcome stricter data privacy policies, users have reported that Claude is performing significantly worse this year, particularly with respect to the use cases covered by the updated usage policy.

“Some stuff that’s very open to interpretation or just outright dumb. Want to write some facts about the well-documented health risks of obesity? You’d be violating the “body shaming” rule. You can’t create anything that could be considered ‘emotionally harmful’,” one Reddit user said.

Further, they said that such violations would be hard to adjudicate, considering there is no guarantee that those reviewing them would be unbiased or neutral when it comes to political misinformation.

Additionally, sexually-explicit content generation has also been significantly restricted. One user said that a story they had been working on with Claude had stopped progressing because Claude refused to continue, stating that it was uncomfortable with the prompt.

This was further backed by several users who stated the same issue, including one who said that Claude refused to comply with providing quotes from certain fictional characters, citing copyright infringement.

“You can’t ‘promote or advocate for a particular political candidate, party, issue or position’. Want to write a persuasive essay about an issue that can be construed as political? Better not use Claude,” they said.

What’s the Damage?

At the moment, users are willing to give both Claude and Anthropic the benefit of the doubt. With the updated policies, seemingly prompted in part by the EU AI Act, Anthropic has made it easier to flag issues with its products and raise data privacy concerns.

These include two email addresses, one of them for Anthropic’s Data Protection Officer (DPO), for raising complaints or offering feedback; neither was present in the previous iteration of the policy.

Similarly, users believe that while Claude seems to have been handicapped by the new ToS, this could be reverted if given enough time and if the issues are raised by the users. “Anthropic does seem willing to listen to user feedback – and we’ve seen with the release of the Claude 3 models the dialling back of the refusals. So I think, at some point in the future, Anthropic will loosen up on things like that,” another user said.

Whether this can actually happen or if Anthropic will stick to its guns to preserve a user base in the EU and Canada is yet to be seen.

It’s safe to conclude that the noose is only tightening around big tech companies, and Claude seems to be just the first in a long line of victims of over-regulation.
