Style transfer aims to produce synthesized images that retain the content of one image while adopting the artistic style of another. Traditional style transfer methods often require training a separate transformation network for each new style, limiting their adaptability and scalability. To address this challenge, we propose Meta FIST, a flow-based image style transfer framework that integrates Randomized Hierarchy Flow (RH Flow) with a meta network for adaptive parameter generation. The meta network dynamically produces the RH Flow parameters conditioned on the style image, enabling efficient and flexible style adaptation without retraining for new styles. RH Flow enhances feature interaction by randomly permuting feature sub-blocks before hierarchical coupling, promoting diverse and expressive stylization while preserving content structure. Experimental results demonstrate that Meta FIST achieves superior content retention, style fidelity, and adaptability compared with existing approaches.
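As a rough illustration of the idea, one RH Flow step can be sketched as a random permutation of channel sub-blocks followed by a hierarchical affine coupling. This is a minimal sketch, not the actual architecture: the function name, the sub-block count, the affine coupling form, and the way conditioning enters are all assumptions, and in the real framework the scale and shift parameters would be produced by the meta network from the style image.

```python
import numpy as np

def rh_flow_step(x, params, rng):
    """Illustrative single RH Flow step (assumed form, not the paper's exact layer).

    x      : (C, H, W) feature map, C divisible by the number of sub-blocks.
    params : list of (scale, shift) pairs, one per coupled sub-block;
             in the full framework these would come from the meta network.
    rng    : numpy Generator used to draw the random permutation.
    """
    k = len(params) + 1                       # number of sub-blocks
    blocks = np.split(x, k, axis=0)           # split channels into sub-blocks
    perm = rng.permutation(k)                 # randomized sub-block order
    blocks = [blocks[i] for i in perm]

    out = [blocks[0]]                         # first block passes through unchanged
    cond = blocks[0]
    for b, (scale, shift) in zip(blocks[1:], params):
        # hierarchical affine coupling: each block is modulated conditioned
        # on a summary of the already-transformed earlier blocks, so the
        # step stays invertible given (params, perm)
        b = b * scale + shift + 0.1 * cond.mean()
        out.append(b)
        cond = np.concatenate([cond, b], axis=0)
    return np.concatenate(out, axis=0), perm  # keep perm to allow inversion
```

Because each coupled block depends only on blocks that have already been produced, the step can be inverted exactly by replaying the same permutation and undoing each affine transform in order, which is what makes a flow-based formulation attractive for preserving content.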