

Royal Commission for Riyadh City (RCRC) Open Data Portal Connector

Connector Details

Type: Virtual machines, Single VM, BYOL
Runs on: Google Compute Engine
Last Update: 24 October, 2024
Category:

Overview

The Royal Commission for Riyadh City (RCRC) Open Data Portal connector v2.1 provides a RESTful interface for accessing datasets related to the Riyadh Region, encompassing environmental, administrative, urban planning, and demographic data. Built on the Opendatasoft platform, it supports querying, filtering, and exporting datasets using the Opendatasoft Query Language (ODSQL). The API delivers data in JSON and multiple export formats, enabling data analysis, visualization, and urban planning applications through a GCP custom connector. Most endpoints do not require authentication, but an optional API key may be needed for restricted endpoints or premium features.

Integration Overview

This document details each integration point of the RCRC Open Data Portal API v2.1, including purpose, configuration, and supported workflows for a GCP custom connector. The API facilitates querying datasets, retrieving records, exporting data, and exploring facets, focusing on Riyadh-specific data such as environmental conditions, city infrastructure, and administrative boundaries. Authentication via an API key is optional but may be required for specific endpoints with restrictions or higher usage quotas.

Supported Integration Action Points

  • getDatasets: Retrieves a list of datasets in the Riyadh Region catalog.
  • listExportFormats: Enumerates available export formats for the catalog or datasets.
  • exportDatasets: Exports the catalog in a specified format (e.g., CSV, JSON, DCAT).
  • exportCatalogCSV: Exports the catalog in CSV format with customizable parameters.
  • exportCatalogDCAT: Exports the catalog in RDF/XML (DCAT) format with regional variants.
  • getDatasetsFacets: Retrieves facet values for datasets to enable guided navigation.
  • getRecords: Queries records within a specific dataset.
  • listDatasetExportFormats: Lists export formats for a specific dataset.
  • exportRecords: Exports dataset records in a specified format (e.g., CSV, GeoJSON, Parquet).
  • exportRecordsCSV: Exports dataset records in CSV format with customizable delimiters.
  • exportRecordsParquet: Exports dataset records in Parquet format with compression options.
  • exportRecordsGPX: Exports dataset records in GPX format for geospatial data.
  • getDataset: Retrieves metadata and endpoints for a specific dataset.
  • getRecordsFacets: Enumerates facet values for dataset records.
  • getDatasetAttachments: Lists attachments associated with a dataset.
  • getRecord: Retrieves a single record by its identifier.

Detailed Integration Documentation

Datasets Retrieval

Action getDatasets
Purpose Fetches a list of datasets in the Riyadh Region catalog, including metadata such as dataset identifiers, fields, and features. Serves as the primary entry point for dataset discovery.
Parameters
  • Required:
    • None
  • Optional:
    • apiKey: API key for authentication, if required for restricted datasets (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the response (string, e.g., dataset_id, fields).
    • where: ODSQL filter expression (string, e.g., publisher="Opendatasoft").
    • order_by: Sorting criteria (string, e.g., modified desc).
    • limit: Number of datasets to return (integer, default: 10, maximum: 100).
    • offset: Starting index for pagination (integer, default: 0).
    • refine: Facet filter (string, e.g., refine=theme:Environment).
    • exclude: Facet exclusion filter (string, e.g., exclude=language:fr).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
    • group_by: Grouping expression (string, e.g., theme).
    • include_links: Include HATEOAS links in the response (boolean, default: false).
    • include_app_metas: Include application metadata (boolean, default: false).
Configuration
  • Configure the GCP custom connector with the API endpoint.
  • Include apiKey in the query string only for restricted endpoints.
  • Ensure the connector handles JSON responses and ODSQL query parameters.
Output
  • Successful: JSON object containing:
    • total_count: Total number of datasets (integer, e.g., 19).
    • links: Array of HATEOAS links (if enabled).
    • results: Array of dataset objects with properties like dataset_id, fields, and metas.
  • Failure: JSON object with error details (e.g., error_code: ODSQLError, message: "Invalid ODSQL query").
Workflow Example
  • Configure the connector with the API endpoint, including apiKey if accessing restricted datasets.
  • Execute getDatasets with refine=theme:Environment to filter environmental datasets.
  • Parse the response in a GCP Dataflow pipeline to extract dataset metadata for further queries.
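The request above can be sketched in Python. The base URL is an assumption (the portal is built on Opendatasoft, whose Explore API v2.1 exposes a `/catalog/datasets` path); substitute the portal's actual host, and treat `build_get_datasets` as an illustrative helper rather than part of the connector.

```python
# Assumed host; the portal follows the standard Opendatasoft Explore API v2.1 layout.
BASE_URL = "https://opendata.rcrc.gov.sa/api/explore/v2.1"

def build_get_datasets(api_key=None, **params):
    """Return (url, query) for the getDatasets action, dropping unset parameters."""
    query = {k: v for k, v in params.items() if v is not None}
    if api_key:
        query["apikey"] = api_key  # only needed for restricted datasets
    return f"{BASE_URL}/catalog/datasets", query

url, query = build_get_datasets(refine="theme:Environment", limit=20)
# An HTTP client such as requests would then issue:
#   resp = requests.get(url, params=query, timeout=30)
#   datasets = resp.json()["results"]
```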

Catalog Export Formats

Action listExportFormats
Purpose Enumerates supported export formats for the catalog (e.g., CSV, JSON, DCAT), enabling selection of appropriate formats for data extraction.
Parameters
  • Required:
    • None
  • Optional:
    • apiKey: API key for authentication, if required for restricted formats (string, e.g., YOUR-API-KEY).
Configuration
  • Configure the GCP connector with the API endpoint.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with:
    • links: Array of HATEOAS links to export endpoints.
  • Failure: JSON object with error details (e.g., 401 Unauthorized).
Workflow Example
  • Execute listExportFormats in the GCP connector, including apiKey if needed.
  • Parse the response to identify supported formats (e.g., csv, json, dcat).
  • Store format options in Google Cloud Storage for subsequent export actions.
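Parsing the format list out of the HATEOAS links can be sketched as below; the sample response shape follows the standard Opendatasoft pattern, where each non-navigational link's `rel` names an export format.

```python
def export_formats_from_links(links):
    """Extract export format names from listExportFormats HATEOAS links.

    Assumes navigational rels ('self', 'first', 'next', 'last') are not formats.
    """
    skip = {"self", "first", "next", "last"}
    return sorted(link["rel"] for link in links if link.get("rel") not in skip)

# Illustrative response fragment:
sample_links = [
    {"rel": "self", "href": "https://example.org/catalog/exports"},
    {"rel": "csv", "href": "https://example.org/catalog/exports/csv"},
    {"rel": "json", "href": "https://example.org/catalog/exports/json"},
]
formats = export_formats_from_links(sample_links)
```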

Catalog Export

Action exportDatasets
Purpose Exports the entire catalog in a specified format, supporting bulk data retrieval for analysis or archival within GCP workflows.
Parameters
  • Required:
    • format: Export format (string, e.g., csv, json, dcat).
  • Optional:
    • apiKey: API key for authentication, if required for restricted formats (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the export (string, e.g., dataset_id, fields).
    • where: ODSQL filter expression (string, e.g., publisher="Opendatasoft").
    • order_by: Sorting criteria (string, e.g., modified desc).
    • limit: Number of datasets to export (integer, default: 10, maximum: 100).
    • offset: Starting index for pagination (integer, default: 0).
    • refine: Facet filter (string, e.g., refine=theme:Environment).
    • exclude: Facet exclusion filter (string, e.g., exclude=language:fr).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
    • group_by: Grouping expression (string, e.g., theme).
Configuration
  • Configure the connector to handle file downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: File in the specified format (e.g., CSV, JSON).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute exportDatasets with format=csv and apiKey if required.
  • Store the exported file in Google Cloud Storage for offline analysis.
  • Process the data in BigQuery for Riyadh environmental or urban metrics reporting.
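A minimal sketch of building the export URL, assuming the standard Opendatasoft path `/catalog/exports/{format}` and an assumed host; the streaming download shown in comments uses the common `requests` idiom for large files.

```python
import urllib.parse

BASE_URL = "https://opendata.rcrc.gov.sa/api/explore/v2.1"  # assumed host

def catalog_export_url(fmt, **params):
    """URL for exportDatasets: /catalog/exports/{format} plus an encoded query."""
    query = urllib.parse.urlencode({k: v for k, v in params.items() if v is not None})
    return f"{BASE_URL}/catalog/exports/{fmt}" + (f"?{query}" if query else "")

url = catalog_export_url("csv", limit=100)
# Streaming the file to disk (or on to a Cloud Storage client) might look like:
#   with requests.get(url, stream=True, timeout=60) as r:
#       with open("catalog.csv", "wb") as f:
#           for chunk in r.iter_content(1 << 16):
#               f.write(chunk)
```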

Catalog CSV Export

Action exportCatalogCSV
Purpose Exports the catalog in CSV format with customizable delimiter and encoding options, optimized for integration with GCP data processing tools like BigQuery.
Parameters
  • Required:
    • None
  • Optional:
    • apiKey: API key for authentication, if required for restricted exports (string, e.g., YOUR-API-KEY).
    • delimiter: Field delimiter (string, e.g., ;, default: ;).
    • list_separator: Separator for multivalued fields (string, default: ,).
    • quote_all: Quote all string values (boolean, default: false).
    • with_bom: Include Unicode Byte Order Mark (boolean, default: true).
Configuration
  • Configure the connector to handle CSV file downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: CSV file with catalog data.
  • Failure: JSON object with error details (e.g., 429 Too Many Requests).
Workflow Example
  • Execute exportCatalogCSV with delimiter=; and with_bom=true, including apiKey if needed.
  • Upload the CSV to Google Cloud Storage or BigQuery for analysis.
  • Analyze Riyadh-specific dataset metadata for environmental trends.
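The CSV export options above can be packaged into a query dictionary as follows; serializing the booleans as lowercase strings matches the usual query-string convention, and the defaults mirror the parameter table.

```python
def csv_export_params(delimiter=";", list_separator=",", quote_all=False,
                      with_bom=True, api_key=None):
    """Query parameters for exportCatalogCSV; defaults follow the documented values."""
    params = {
        "delimiter": delimiter,
        "list_separator": list_separator,
        "quote_all": str(quote_all).lower(),
        "with_bom": str(with_bom).lower(),
    }
    if api_key:
        params["apikey"] = api_key  # only for restricted exports
    return params

params = csv_export_params(with_bom=True)
```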

Catalog DCAT Export

Action exportCatalogDCAT
Purpose Exports the catalog in RDF/XML (DCAT) format with regional variants, suitable for semantic web applications within GCP. May require authentication for certain variants.
Parameters
  • Required:
    • dcat_ap_format: DCAT variant (string, e.g., _ap_ch, _ap_de).
  • Optional:
    • apiKey: API key for authentication, if required for restricted variants (string, e.g., YOUR-API-KEY).
    • include_exports: Restrict export formats (string, e.g., csv,json).
    • use_labels_in_exports: Use field labels instead of names (boolean, default: true).
Configuration
  • Configure the connector for RDF/XML parsing.
  • Include apiKey for restricted endpoints.
  • Verify subscription for DCAT variants.
Output
  • Successful: RDF/XML file with catalog data.
  • Failure: JSON object with error details (e.g., 403 Forbidden).
Workflow Example
  • Verify subscription for DCAT support in the GCP connector.
  • Execute exportCatalogDCAT with dcat_ap_format=_ap_ch and apiKey if required.
  • Process the RDF/XML file in a GCP semantic data pipeline.
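One way to form the DCAT export URL, under the assumption (suggested by the shape of the `dcat_ap_format` values such as `_ap_ch`) that the regional suffix is appended to the `dcat` format segment of the export path:

```python
BASE_URL = "https://opendata.rcrc.gov.sa/api/explore/v2.1"  # assumed host

def dcat_export_url(dcat_ap_format, api_key=None):
    """URL for exportCatalogDCAT; suffix-appending behaviour is an assumption."""
    url = f"{BASE_URL}/catalog/exports/dcat{dcat_ap_format}"
    if api_key:
        url += f"?apikey={api_key}"  # only for restricted variants
    return url

url = dcat_export_url("_ap_ch")
```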

Datasets Facets Retrieval

Action getDatasetsFacets
Purpose Retrieves facet values for datasets (e.g., publisher, theme) to support guided navigation and filtering within GCP applications.
Parameters
  • Required:
    • None
  • Optional:
    • apiKey: API key for authentication, if required for restricted facets (string, e.g., YOUR-API-KEY).
    • facet: Facet field to enumerate (string, e.g., publisher).
    • refine: Facet filter (string, e.g., refine=theme:Environment).
    • exclude: Facet exclusion filter (string, e.g., exclude=language:fr).
    • where: ODSQL filter expression (string, e.g., publisher="Opendatasoft").
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
Configuration
  • Configure the connector to parse JSON responses.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with:
    • links: Array of HATEOAS links.
    • facets: Array of facet objects (e.g., name: publisher, facets: [{name: "Opendatasoft", count: 2}]).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute getDatasetsFacets with facet=theme and apiKey if needed.
  • Parse facet values to build a navigation interface for environmental datasets in a GCP application.
  • Use refine=theme:Environment for targeted dataset filtering.
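The facet response can be flattened into a plain lookup table for a navigation interface; the sample input mirrors the response shape shown under Output above.

```python
def facet_counts(response):
    """Flatten a getDatasetsFacets response into {facet_name: {value: count}}."""
    return {
        facet["name"]: {v["name"]: v["count"] for v in facet.get("facets", [])}
        for facet in response.get("facets", [])
    }

# Illustrative response fragment:
sample = {"facets": [{"name": "publisher",
                      "facets": [{"name": "Opendatasoft", "count": 2}]}]}
counts = facet_counts(sample)
```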

Records Retrieval

Action getRecords
Purpose Queries records within a specific dataset, enabling detailed exploration of Riyadh-specific data (e.g., environmental or demographic records).
Parameters
  • Required:
    • dataset_id: Dataset identifier (string, e.g., world-administrative-boundaries).
  • Optional:
    • apiKey: API key for authentication, if required for restricted datasets (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the response (string, e.g., name, coordinates).
    • where: ODSQL filter expression (string, e.g., cou_name_en="Saudi Arabia").
    • group_by: Grouping expression (string, e.g., region).
    • order_by: Sorting criteria (string, e.g., name asc).
    • limit: Number of records to return (integer, default: 10, maximum: 100).
    • offset: Starting index for pagination (integer, default: 0).
    • refine: Facet filter (string, e.g., refine=cou_name_en:Saudi Arabia).
    • exclude: Facet exclusion filter (string, e.g., exclude=language:fr).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
    • include_links: Include HATEOAS links in the response (boolean, default: false).
    • include_app_metas: Include application metadata (boolean, default: false).
Configuration
  • Configure the connector to handle JSON responses and ODSQL queries.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with:
    • total_count: Total number of records (integer, e.g., 137611).
    • links: Array of HATEOAS links.
    • results: Array of record objects (e.g., _id, _timestamp, fields like name, coordinates).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute getRecords with dataset_id=geonames-all-cities, refine=cou_name_en:Saudi Arabia, and apiKey if needed.
  • Process records in a GCP Dataflow pipeline to analyze Riyadh’s population or environmental data.
  • Store results in BigQuery for further querying.
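Because `limit` caps each page at 100 records, collecting a full result set means paginating with `offset`. The loop below takes an injected `fetch_page` callable so the paging logic can be shown (and tested) without a live HTTP client; in the connector, `fetch_page` would wrap the actual getRecords call.

```python
def fetch_all_records(fetch_page, page_size=100):
    """Paginate getRecords; fetch_page(limit=..., offset=...) returns a response dict."""
    records, offset = [], 0
    while True:
        page = fetch_page(limit=page_size, offset=offset)
        results = page.get("results", [])
        records.extend(results)
        offset += len(results)
        if not results or offset >= page.get("total_count", 0):
            return records

# Fake fetcher standing in for the HTTP call, purely to illustrate the loop:
def fake_fetch(limit, offset):
    data = [{"_id": i} for i in range(7)]
    return {"total_count": 7, "results": data[offset:offset + limit]}

all_records = fetch_all_records(fake_fetch, page_size=3)
```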

Dataset Export

Action exportRecords
Purpose Exports dataset records in a specified format, supporting large-scale data retrieval for GCP-based analysis or visualization.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
    • format: Export format (string, e.g., csv, geojson).
  • Optional:
    • apiKey: API key for authentication, if required for restricted formats (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the response (string).
    • where: ODSQL filter expression (string).
    • order_by: Sorting criteria (string).
    • group_by: Grouping expression (string).
    • limit: Number of records to return (integer).
    • refine: Facet filter (string).
    • exclude: Facet exclusion filter (string).
    • lang: Language for formatting strings (string, e.g., en).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh).
    • use_labels: Use field labels instead of names (boolean, default: false).
    • compressed: Enable GZIP compression (boolean, default: false).
    • epsg: Spatial projection code (integer, default: 4326).
Configuration
  • Configure the connector to handle file downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: File in the specified format (e.g., GeoJSON, CSV).
  • Failure: JSON object with error details (e.g., 429 Too Many Requests).
Workflow Example
  • Execute exportRecords with dataset_id=geonames-all-cities, format=geojson, and apiKey if needed.
  • Store the GeoJSON file in Google Cloud Storage for geospatial analysis.
  • Visualize Riyadh’s urban data using Google Cloud’s Data Studio.
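The export URL for dataset records can be sketched as below; the path `/catalog/datasets/{dataset_id}/exports/{format}` is the standard Opendatasoft layout, and the host is an assumption.

```python
import urllib.parse

BASE_URL = "https://opendata.rcrc.gov.sa/api/explore/v2.1"  # assumed host

def records_export_url(dataset_id, fmt, **params):
    """URL for exportRecords: /catalog/datasets/{dataset_id}/exports/{format}."""
    query = urllib.parse.urlencode({k: v for k, v in params.items() if v is not None})
    path = f"{BASE_URL}/catalog/datasets/{urllib.parse.quote(dataset_id)}/exports/{fmt}"
    return path + (f"?{query}" if query else "")

url = records_export_url("geonames-all-cities", "geojson", epsg=4326)
```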

Dataset CSV Export

Action exportRecordsCSV
Purpose Exports dataset records in CSV format with customizable options, optimized for integration with GCP tools like BigQuery or Dataflow.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted exports (string, e.g., YOUR-API-KEY).
    • delimiter: Field delimiter (string, e.g., ;, default: ;).
    • list_separator: Separator for multivalued fields (string, default: ,).
    • quote_all: Quote all string values (boolean, default: false).
    • with_bom: Include Unicode Byte Order Mark (boolean, default: true).
Configuration
  • Configure the connector to handle CSV downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: CSV file with dataset records.
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute exportRecordsCSV with dataset_id=geonames-all-cities, delimiter=;, and apiKey if needed.
  • Upload the CSV to BigQuery for environmental data analysis.
  • Generate reports on Riyadh-specific metrics.
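Since the export uses `;` as the delimiter and may carry a BOM (`with_bom` defaults to true), downstream parsing has to account for both. A small stdlib sketch, with an illustrative two-row payload:

```python
import csv
import io

def parse_export_csv(text, delimiter=";"):
    """Parse an exportRecordsCSV payload into a list of dicts.

    Strips a leading BOM (present when with_bom=true) before reading the header.
    """
    return list(csv.DictReader(io.StringIO(text.lstrip("\ufeff")),
                               delimiter=delimiter))

# Illustrative payload; field names are assumptions:
sample = "name;cou_name_en\nRiyadh;Saudi Arabia\n"
rows = parse_export_csv(sample)
```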

Dataset Parquet Export

Action exportRecordsParquet
Purpose Exports dataset records in Parquet format, optimized for big data processing in GCP environments like Dataproc.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted exports (string, e.g., YOUR-API-KEY).
    • parquet_compression: Compression algorithm (string, e.g., snappy, default: snappy).
Configuration
  • Configure the connector to handle Parquet file downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: Parquet file with dataset records.
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute exportRecordsParquet with dataset_id=geonames-all-cities, parquet_compression=zstd, and apiKey if needed.
  • Process the Parquet file in GCP Dataproc for urban analysis.
  • Store results in BigQuery for querying.
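The Parquet export takes only the compression option beyond the dataset identifier; a small helper for the query parameters, with the documented `snappy` default:

```python
def parquet_export_params(parquet_compression="snappy", api_key=None):
    """Query parameters for exportRecordsParquet; 'snappy' is the documented default."""
    params = {"parquet_compression": parquet_compression}
    if api_key:
        params["apikey"] = api_key  # only for restricted exports
    return params

params = parquet_export_params("zstd")
# The downloaded file could then be read with pyarrow, e.g.
#   pyarrow.parquet.read_table("records.parquet")
```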

Dataset GPX Export

Action exportRecordsGPX
Purpose Exports dataset records in GPX format, suitable for geospatial applications in GCP, such as mapping Riyadh’s environmental features.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted exports (string, e.g., YOUR-API-KEY).
    • name_field: Field used for the GPX name attribute (string).
    • description_field_list: Fields used for the GPX description attribute (string).
    • use_extension: Use the GPX extension tag for attributes (boolean, default: true).
Configuration
  • Configure the connector to handle GPX file downloads.
  • Include apiKey for restricted endpoints.
Output
  • Successful: GPX file with geospatial data.
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute exportRecordsGPX with dataset_id=geonames-all-cities, name_field=name, and apiKey if needed.
  • Store the GPX file in Google Cloud Storage.
  • Import the file into a GIS tool on GCP for mapping Riyadh’s points of interest.
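A downloaded GPX file can be inspected with the standard library before handing it to a GIS tool; the sketch below assumes the GPX 1.1 namespace and extracts waypoint names (the field selected via name_field).

```python
import xml.etree.ElementTree as ET

def waypoint_names(gpx_text):
    """Extract waypoint names from a GPX export (GPX 1.1 namespace assumed)."""
    ns = {"gpx": "http://www.topografix.com/GPX/1/1"}
    root = ET.fromstring(gpx_text)
    return [n.text for n in root.findall("gpx:wpt/gpx:name", ns)]

# Illustrative one-waypoint document:
sample = (
    '<gpx xmlns="http://www.topografix.com/GPX/1/1">'
    '<wpt lat="24.7136" lon="46.6753"><name>Riyadh</name></wpt>'
    '</gpx>'
)
names = waypoint_names(sample)
```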

Dataset Information Retrieval

Action getDataset
Purpose Retrieves metadata and endpoints for a specific dataset, enabling structural understanding for GCP workflows.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted datasets (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the response (string).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
    • include_links: Include HATEOAS links in the response (boolean, default: false).
    • include_app_metas: Include application metadata (boolean, default: false).
Configuration
  • Configure the connector to parse JSON metadata.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with dataset metadata (e.g., dataset_id, fields, metas, attachments).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute getDataset with dataset_id=geonames-all-cities and apiKey if needed.
  • Parse metadata to identify fields like coordinates or population.
  • Plan subsequent record queries or exports in the GCP connector.
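The metadata parse in the workflow above might look like this; the `fields` array with `name`/`type` entries follows the response shape the Output section describes, and the sample values are illustrative.

```python
def field_types(dataset):
    """Map field name -> type from a getDataset response, for planning queries."""
    return {f["name"]: f["type"] for f in dataset.get("fields", [])}

# Illustrative response fragment:
sample = {
    "dataset_id": "geonames-all-cities",
    "fields": [
        {"name": "name", "type": "text"},
        {"name": "coordinates", "type": "geo_point_2d"},
    ],
}
types = field_types(sample)
```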

Records Facets Retrieval

Action getRecordsFacets
Purpose Enumerates facet values for dataset records (e.g., country, timezone), enabling guided navigation in GCP applications.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted facets (string, e.g., YOUR-API-KEY).
    • facet: Facet field to enumerate (string, e.g., cou_name_en).
    • where: ODSQL filter expression (string).
    • refine: Facet filter (string).
    • exclude: Facet exclusion filter (string).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
Configuration
  • Configure the connector to parse JSON facet responses.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with:
    • links: Array of HATEOAS links.
    • facets: Array of facet objects (e.g., name: cou_name_en, facets: [{value: "Saudi Arabia", count: 313}]).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute getRecordsFacets with dataset_id=geonames-all-cities, facet=cou_name_en, and apiKey if needed.
  • Use facet values in a GCP application to filter records for Saudi Arabia.
  • Build a navigation interface for Riyadh-specific data.
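Picking the dominant facet value (e.g., to pre-select Saudi Arabia in a filter UI) can be sketched as below. This document shows facet entries keyed both as `name` and as `value`, so the helper accepts either spelling.

```python
def top_facet_value(response, facet_name):
    """Return the most frequent value of one facet in a getRecordsFacets response."""
    for facet in response.get("facets", []):
        if facet.get("name") == facet_name:
            values = facet.get("facets", [])
            if not values:
                return None
            best = max(values, key=lambda v: v.get("count", 0))
            return best.get("name", best.get("value"))
    return None

# Illustrative response fragment:
sample = {"facets": [{"name": "cou_name_en",
                      "facets": [{"name": "Saudi Arabia", "count": 313},
                                 {"name": "Egypt", "count": 100}]}]}
top = top_facet_value(sample, "cou_name_en")
```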

Dataset Attachments Retrieval

Action getDatasetAttachments
Purpose Lists attachments (e.g., ZIP files) associated with a dataset, providing supplementary data for GCP workflows.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
  • Optional:
    • apiKey: API key for authentication, if required for restricted attachments (string, e.g., YOUR-API-KEY).
Configuration
  • Configure the connector to parse JSON attachment lists.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with:
    • links: Array of HATEOAS links.
    • attachments: Array of attachment objects (e.g., href, metas: {mime-type: "application/zip"}).
  • Failure: JSON object with error details (e.g., 400 Bad Request).
Workflow Example
  • Execute getDatasetAttachments with dataset_id=geonames-all-cities and apiKey if needed.
  • Download attachments like cities1000.zip to Google Cloud Storage.
  • Process supplementary data for offline analysis in GCP.
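Selecting the attachments worth downloading (e.g., only ZIP archives) is a small filter over the response; the sample below mirrors the Output shape, with an illustrative href.

```python
def attachment_urls(response, mime_type=None):
    """Collect attachment hrefs from getDatasetAttachments, optionally by MIME type."""
    return [
        att["href"]
        for att in response.get("attachments", [])
        if mime_type is None or att.get("metas", {}).get("mime-type") == mime_type
    ]

# Illustrative response fragment:
sample = {"attachments": [
    {"href": "https://example.org/cities1000.zip",
     "metas": {"mime-type": "application/zip"}},
]}
zips = attachment_urls(sample, "application/zip")
```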

Single Record Retrieval

Action getRecord
Purpose Retrieves a single record by its identifier, enabling detailed inspection of specific data points in GCP applications.
Parameters
  • Required:
    • dataset_id: Dataset identifier (string).
    • record_id: Record identifier (string, e.g., 2978771).
  • Optional:
    • apiKey: API key for authentication, if required for restricted records (string, e.g., YOUR-API-KEY).
    • select: Fields to include in the response (string).
    • lang: Language for formatting strings (string, e.g., en, default: fr).
    • timezone: Timezone for datetime fields (string, e.g., Asia/Riyadh, default: UTC).
Configuration
  • Configure the connector to parse JSON record data.
  • Include apiKey for restricted endpoints.
Output
  • Successful: JSON object with record data (e.g., _id, name, coordinates).
  • Failure: JSON object with error details (e.g., 404 Not Found).
Workflow Example
  • Execute getRecord with dataset_id=geonames-all-cities, record_id=2978771, and apiKey if needed.
  • Parse record details for a specific Riyadh location in a GCP application.
  • Use data for targeted analysis or visualization in Data Studio.
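The single-record URL follows the standard Opendatasoft path `/catalog/datasets/{dataset_id}/records/{record_id}`; the host is assumed, and the comma in `select` is percent-encoded by the query builder.

```python
import urllib.parse

BASE_URL = "https://opendata.rcrc.gov.sa/api/explore/v2.1"  # assumed host

def record_url(dataset_id, record_id, **params):
    """URL for getRecord: /catalog/datasets/{dataset_id}/records/{record_id}."""
    query = urllib.parse.urlencode({k: v for k, v in params.items() if v is not None})
    url = (f"{BASE_URL}/catalog/datasets/{urllib.parse.quote(dataset_id)}"
           f"/records/{urllib.parse.quote(str(record_id))}")
    return url + (f"?{query}" if query else "")

url = record_url("geonames-all-cities", "2978771", select="name,coordinates")
```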

Workflow Creation with the Connector

Example Workflow: Analyzing Riyadh’s Environmental and Urban Data in GCP

Retrieve Available Datasets

  • Execute getDatasets with refine=theme:Environment to fetch environmental datasets.
  • Store dataset metadata in Google Cloud Storage for further processing.

Explore Dataset Facets

  • Execute getDatasetsFacets with facet=publisher to identify dataset providers, including apiKey if required.
  • Use facets to filter Riyadh-specific sources in a GCP application.

Query Dataset Records

  • Execute getRecords with dataset_id=geonames-all-cities, refine=cou_name_en:Saudi Arabia, and apiKey if needed.
  • Process records in a GCP Dataflow pipeline to analyze Riyadh’s population or environmental data.
  • Store results in BigQuery for querying.

Export Data for Analysis

  • Execute exportRecordsCSV with dataset_id=geonames-all-cities, delimiter=;, and apiKey if needed.
  • Upload the CSV to BigQuery for environmental or urban trend analysis.

Visualize Geospatial Data

  • Execute exportRecordsGPX with dataset_id=geonames-all-cities, name_field=name, and apiKey if needed.
  • Store the GPX file in Google Cloud Storage and import into a GIS tool for mapping Riyadh’s environmental features.

Access Supplementary Data

  • Execute getDatasetAttachments with dataset_id=geonames-all-cities and apiKey if needed.
  • Download attachments to Google Cloud Storage for additional Riyadh-specific insights.
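The workflow steps above can be sketched as one orchestration function. I/O is injected (`get_json` for the HTTP calls, `save_blob` for Cloud Storage writes) so the flow can be shown without live credentials; the `air-quality` dataset id in the stub is hypothetical.

```python
def run_environment_pipeline(get_json, save_blob):
    """Sketch of the workflow: discover environmental datasets, pull Saudi records.

    get_json(path, **params) -> response dict; save_blob(name, data) stores output.
    """
    catalog = get_json("/catalog/datasets", refine="theme:Environment")
    dataset_ids = [d["dataset_id"] for d in catalog.get("results", [])]
    for ds in dataset_ids:
        records = get_json(f"/catalog/datasets/{ds}/records",
                           refine="cou_name_en:Saudi Arabia", limit=100)
        save_blob(f"{ds}.json", records)  # e.g., upload to a GCS bucket
    return dataset_ids

# Stubs standing in for the HTTP and Cloud Storage clients:
def fake_get(path, **params):
    if path == "/catalog/datasets":
        return {"results": [{"dataset_id": "air-quality"}]}  # hypothetical id
    return {"results": [{"_id": 1}]}

saved = {}
ids = run_environment_pipeline(fake_get, lambda name, data: saved.update({name: data}))
```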

Pricing

Request a Quote

Support

For technical support, please contact us at

custom-connectors-support@isolutions.sa

© Copyright 2024 iSolution | All Rights Reserved