Historically, companies tried to solve the data-access problem with point-to-point integrations and centralized data hubs. Unfortunately, the point-to-point approach creates integration spaghetti whose cost and complexity grow exponentially with each new connection. Centralized data hubs address this by offering a single point of control and monitoring, but they struggle with the siloed and ever-changing nature of enterprise data.
A data fabric is an architectural approach that takes this nature of data into account: data lives everywhere and changes constantly. In a data fabric, a set of data services provides consistent capabilities across endpoints spanning hybrid multi-cloud environments. It is an architecture that unifies data management practices and practicalities across cloud, on-premises, and edge environments. For example, a business API layer in a data fabric could contain a GetCustomer API that returns an enterprise-wide agreed set of customer data. A GetProductionData API would follow the same format and structure, yet the two could run on different cloud providers and API hosting platforms.
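The idea of one agreed contract served by interchangeable backends can be sketched as follows. This is a minimal illustration, not a real fabric implementation: the Customer fields, the two backend functions, and the hard-coded sample data are all hypothetical assumptions.

```python
from dataclasses import dataclass, asdict

# Hypothetical enterprise-wide customer contract, agreed across the fabric.
@dataclass(frozen=True)
class Customer:
    customer_id: str
    name: str
    segment: str

# Two hypothetical backends, e.g. one per cloud provider. Both honor the
# same contract; in a real fabric each would call its provider's data service.
def get_customer_from_cloud_a(customer_id: str) -> Customer:
    return Customer(customer_id=customer_id, name="Example Corp", segment="enterprise")

def get_customer_from_cloud_b(customer_id: str) -> Customer:
    return Customer(customer_id=customer_id, name="Example Corp", segment="enterprise")

# Consumers see a single GetCustomer API regardless of where the data lives.
def get_customer(customer_id: str, backend=get_customer_from_cloud_a) -> dict:
    return asdict(backend(customer_id))
```

The point of the sketch is that the consumer-facing call and the returned shape never change when the backing platform does, which is exactly the consistency a data fabric's service layer is meant to guarantee.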
An iPaaS with multi-cloud support is a natural platform on which to build data fabric services. That said, given the nature of the data fabric concept, you should standardize not the tooling but the architecture and the rules for how data is provided and managed.